ansible-galaxy 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-galaxy
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: The specified collections path '/tmp/tmp9ez87j1u' is not part of the configured Ansible collections paths
'/root/.ansible/collections:/usr/share/ansible/collections'. The installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'ansible.posix:1.4.0' to '/tmp/tmp9ez87j1u/ansible_collections/ansible/posix'
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[ERROR]: GNU xorriso 1.4.8 : RockRidge filesystem manipulator, libburnia project.
xorriso : NOTE : Local character set is now assumed as: 'utf-8'
[WARNI] standard-inventory-qcow2: NVMe drive of size '10737418240' will be skipped - no NVMe support on this platform
[WARNI] standard-inventory-qcow2: NVMe drive of size '10737418240' will be skipped - no NVMe support on this platform
[WARNI] standard-inventory-qcow2: NVMe drive of size '10737418240' will be skipped - no NVMe support on this platform
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:20:23 +0000 (0:00:00.024) 0:00:00.024 ********
changed: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
changed: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:20:24 +0000 (0:00:01.331) 0:00:01.355 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_disk_fs.yml *********************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:2
Wednesday 01 June 2022 16:20:24 +0000 (0:00:00.013) 0:00:01.369 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:11
Wednesday 01 June 2022 16:20:25 +0000 (0:00:01.032) 0:00:02.401 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:20:25 +0000 (0:00:00.039) 0:00:02.440 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.153) 0:00:02.594 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.519) 0:00:03.114 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.080) 0:00:03.194 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.021) 0:00:03.216 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.022) 0:00:03.238 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.192) 0:00:03.430 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:20:26 +0000 (0:00:00.018) 0:00:03.449 ********
changed: [/cache/rhel-x.qcow2] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: libblockdev-2.25-12.el9.x86_64",
        "Installed: libblockdev-crypto-2.25-12.el9.x86_64",
        "Installed: vdo-8.1.1.360-1.el9.x86_64",
        "Installed: libblockdev-dm-2.25-12.el9.x86_64",
        "Installed: libblockdev-fs-2.25-12.el9.x86_64",
        "Installed: libblockdev-kbd-2.25-12.el9.x86_64",
        "Installed: libblockdev-loop-2.25-12.el9.x86_64",
        "Installed: libblockdev-lvm-2.25-12.el9.x86_64",
        "Installed: libblockdev-mdraid-2.25-12.el9.x86_64",
        "Installed: libblockdev-mpath-2.25-12.el9.x86_64",
        "Installed: daxctl-libs-71.1-6.el9.x86_64",
        "Installed: libblockdev-nvdimm-2.25-12.el9.x86_64",
        "Installed: libblockdev-part-2.25-12.el9.x86_64",
        "Installed: libblockdev-swap-2.25-12.el9.x86_64",
        "Installed: libblockdev-utils-2.25-12.el9.x86_64",
        "Installed: device-mapper-event-9:1.02.183-4.el9.x86_64",
        "Installed: device-mapper-event-libs-9:1.02.183-4.el9.x86_64",
        "Installed: libbytesize-2.5-3.el9.x86_64",
        "Installed: kmod-kvdo-8.1.1.371-24.el9_0.x86_64",
        "Installed: device-mapper-multipath-0.8.7-9.el9.x86_64",
        "Installed: device-mapper-multipath-libs-0.8.7-9.el9.x86_64",
        "Installed: device-mapper-persistent-data-0.9.0-12.el9.x86_64",
        "Installed: volume_key-libs-0.3.12-15.el9.x86_64",
        "Installed: libaio-0.3.111-13.el9.x86_64",
        "Installed: lsof-4.94.0-3.el9.x86_64",
        "Installed: lvm2-9:2.03.14-4.el9.x86_64",
        "Installed: lvm2-libs-9:2.03.14-4.el9.x86_64",
        "Installed: mdadm-4.2-2.el9.x86_64",
        "Installed: nspr-4.32.0-9.el9.x86_64",
        "Installed: python3-pyparted-1:3.11.7-4.el9.x86_64",
        "Installed: nss-3.71.0-7.el9.x86_64",
        "Installed: ndctl-71.1-6.el9.x86_64",
        "Installed: ndctl-libs-71.1-6.el9.x86_64",
        "Installed: nss-softokn-3.71.0-7.el9.x86_64",
        "Installed: nss-softokn-freebl-3.71.0-7.el9.x86_64",
        "Installed: nss-sysinit-3.71.0-7.el9.x86_64",
        "Installed: nss-util-3.71.0-7.el9.x86_64",
        "Installed: python3-blivet-1:3.4.0-13.el9_0.noarch",
        "Installed: python3-blockdev-2.25-12.el9.x86_64",
        "Installed: blivet-data-1:3.4.0-13.el9_0.noarch",
        "Installed: python3-bytesize-2.5-3.el9.x86_64"
    ]
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:20:41 +0000 (0:00:14.261) 0:00:17.711 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:20:41 +0000 (0:00:00.046) 0:00:17.758 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:20:41 +0000 (0:00:00.050) 0:00:17.808 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:20:42 +0000 (0:00:00.688) 0:00:18.496 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:20:42 +0000 (0:00:00.081) 0:00:18.578 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:20:42 +0000 (0:00:00.022) 0:00:18.600 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:20:42 +0000 (0:00:00.022) 0:00:18.623 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:20:42 +0000 (0:00:00.020) 0:00:18.643 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:20:43 +0000 (0:00:00.873) 0:00:19.517 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": 
{ "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { 
"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "run-r738c5c88f21e4ab5ab56e158e1c38931.service": { "name": "run-r738c5c88f21e4ab5ab56e158e1c38931.service", "source": "systemd", "state": "running", "status": "transient" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": 
"systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:20:44 +0000 (0:00:01.925) 0:00:21.443 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:20:44 +0000 (0:00:00.041) 0:00:21.484 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.026) 0:00:21.511 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.568) 0:00:22.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.030) 0:00:22.109 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.027) 0:00:22.137 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.031) 0:00:22.169 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.032) 0:00:22.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.033) 0:00:22.235 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.027) 0:00:22.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.027) 0:00:22.290 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.028) 0:00:22.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:20:45 +0000 (0:00:00.028) 0:00:22.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:20:46 +0000 (0:00:00.458) 0:00:22.806 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:20:46 +0000 (0:00:00.028) 0:00:22.834 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:14 Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.878) 0:00:23.712 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:21 Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.030) 0:00:23.743 ******** included: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.043) 0:00:23.787 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.509) 0:00:24.296 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.035) 0:00:24.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.032) 0:00:24.364 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create a disk device with the default file system type] ******************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:26
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.036) 0:00:24.401 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:20:47 +0000 (0:00:00.054) 0:00:24.456 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022
16:20:48 +0000 (0:00:00.041) 0:00:24.498 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.501) 0:00:24.999 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.073) 0:00:25.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.030) 0:00:25.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.030) 0:00:25.133 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.094) 0:00:25.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.025) 0:00:25.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.028) 0:00:25.282 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.031) 0:00:25.313 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:20:48
+0000 (0:00:00.035) 0:00:25.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.033) 0:00:25.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.029) 0:00:25.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.027) 0:00:25.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:20:48 +0000 (0:00:00.028) 0:00:25.467 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:20:49 +0000 (0:00:00.040) 0:00:25.507 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 
2022 16:20:49 +0000 (0:00:00.026) 0:00:25.534 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:20:50 +0000 (0:00:01.297) 0:00:26.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:20:50 +0000 (0:00:00.030) 0:00:26.863 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:20:50 +0000 (0:00:00.028) 0:00:26.891 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:20:50 +0000 (0:00:00.036) 0:00:26.928 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:20:50 +0000 (0:00:00.031) 0:00:26.960 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:20:50 +0000 (0:00:00.033) 0:00:26.994 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:20:50 +0000 
(0:00:00.030) 0:00:27.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:20:51 +0000 (0:00:00.937) 0:00:27.961 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:20:52 +0000 (0:00:00.567) 0:00:28.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:20:52 +0000 (0:00:00.679) 0:00:29.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:20:53 +0000 (0:00:00.374) 0:00:29.583 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:20:53 +0000 (0:00:00.031) 0:00:29.614 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:36 Wednesday 01 June 2022 16:20:53 +0000 (0:00:00.855) 0:00:30.470 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:20:54 +0000 (0:00:00.051) 0:00:30.521 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:20:54 +0000 (0:00:00.029) 0:00:30.550 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:20:54 +0000 (0:00:00.037) 0:00:30.588 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": {
    "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7f6e6a9d-ec07-486e-8a41-10b14a7b79f0" },
    "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
    "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
    "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
    "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
    "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
    "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
} }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:20:54 +0000 (0:00:00.500) 0:00:31.089 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat",
"/etc/fstab" ], "delta": "0:00:00.003070", "end": "2022-06-01 12:20:54.549712", "rc": 0, "start": "2022-06-01 12:20:54.546642" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0 /opt/test xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.480) 0:00:31.569 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002575", "end": "2022-06-01 12:20:54.929475", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:20:54.926900" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.380) 0:00:31.950 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.029) 0:00:31.979 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.030) 0:00:32.010 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.061) 0:00:32.071 ********
ok: [/cache/rhel-x.qcow2] =>
{ "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.037) 0:00:32.108 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.114) 0:00:32.222 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.033) 0:00:32.256 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "7f6e6a9d-ec07-486e-8a41-10b14a7b79f0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "7f6e6a9d-ec07-486e-8a41-10b14a7b79f0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.042) 0:00:32.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.036) 0:00:32.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.034) 0:00:32.370 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.037) 0:00:32.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.032) 0:00:32.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:20:55 +0000 (0:00:00.030) 0:00:32.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.029) 0:00:32.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.030) 0:00:32.529 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:20:56 
+0000 (0:00:00.044) 0:00:32.574 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.032) 0:00:32.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.037) 0:00:32.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.027) 0:00:32.672 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.027) 0:00:32.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.033) 0:00:32.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.037) 0:00:32.770 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100449.7601216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100449.7601216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100449.7601216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.408) 0:00:33.178 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.034) 0:00:33.213 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.037) 0:00:33.250 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.033) 0:00:33.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.030) 0:00:33.314 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.033) 0:00:33.348 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.029) 0:00:33.377 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.028) 0:00:33.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.031) 0:00:33.436 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:20:56 +0000 (0:00:00.035) 0:00:33.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.530 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.560 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.031) 0:00:33.620 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.073) 0:00:33.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.033) 0:00:33.728 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.785 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.814 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:33.871 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.027) 0:00:33.899 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.927 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:33.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:34.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.030) 0:00:34.044 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.033) 0:00:34.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.032) 0:00:34.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.031) 0:00:34.142 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.033) 0:00:34.175 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.028) 0:00:34.204 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.033) 0:00:34.237 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.030) 0:00:34.268 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.029) 0:00:34.298 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.030) 0:00:34.328 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.037) 0:00:34.365 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.035) 0:00:34.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.032) 0:00:34.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:20:57 +0000 (0:00:00.027) 0:00:34.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.027) 0:00:34.488 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.027) 0:00:34.516 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.026) 0:00:34.542 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.026) 0:00:34.568 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.027) 0:00:34.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.028) 0:00:34.625 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.029) 0:00:34.654 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change the disk device file system type to "ext4"] ***********************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:38
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.033) 0:00:34.687 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.065) 0:00:34.753 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.042) 0:00:34.795 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.506) 0:00:35.302 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.076) 0:00:35.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.031) 0:00:35.411 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:20:58 +0000 (0:00:00.030) 0:00:35.441 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.059) 0:00:35.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.027) 0:00:35.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.029) 0:00:35.558 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.031) 0:00:35.590 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.036) 0:00:35.627 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.028) 0:00:35.656 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.028) 0:00:35.684 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.030) 0:00:35.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.028) 0:00:35.742 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.043) 0:00:35.786 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:20:59 +0000 (0:00:00.026) 0:00:35.813 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:21:00 +0000 (0:00:01.470) 0:00:37.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:21:00 +0000 (0:00:00.030) 0:00:37.313 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:21:00 +0000 (0:00:00.029) 0:00:37.342 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:21:00 +0000 (0:00:00.037) 0:00:37.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:21:00 +0000 (0:00:00.033) 0:00:37.413 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:21:00 +0000 (0:00:00.035) 0:00:37.448 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump":
"0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=7f6e6a9d-ec07-486e-8a41-10b14a7b79f0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:21:01 +0000 (0:00:00.380) 0:00:37.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:21:01 +0000 (0:00:00.655) 0:00:38.484 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=1851109a-281e-4940-b710-33661f63dc74', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:21:02 +0000 (0:00:00.479) 0:00:38.963 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:21:03 
+0000 (0:00:00.680) 0:00:39.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:21:03 +0000 (0:00:00.360) 0:00:40.004 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:21:03 +0000 (0:00:00.029) 0:00:40.034 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:49 Wednesday 01 June 2022 16:21:04 +0000 (0:00:00.803) 0:00:40.838 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:21:04 +0000 (0:00:00.054) 0:00:40.892 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:21:04 +0000 (0:00:00.028) 0:00:40.921 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:21:04 +0000 (0:00:00.036) 0:00:40.958 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1851109a-281e-4940-b710-33661f63dc74" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": 
"/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:21:04 +0000 (0:00:00.389) 0:00:41.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003495", "end": "2022-06-01 12:21:04.707875", "rc": 0, "start": "2022-06-01 12:21:04.704380" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=1851109a-281e-4940-b710-33661f63dc74 /opt/test ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.383) 0:00:41.731 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002649", "end": "2022-06-01 12:21:05.088087", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:21:05.085438" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.377) 0:00:42.108 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.027) 0:00:42.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.031) 0:00:42.168 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.090) 0:00:42.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.038) 0:00:42.297 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.114) 0:00:42.411 ******** ok: [/cache/rhel-x.qcow2] => 
{ "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:21:05 +0000 (0:00:00.034) 0:00:42.446 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "1851109a-281e-4940-b710-33661f63dc74" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "1851109a-281e-4940-b710-33661f63dc74" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.043) 0:00:42.490 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.038) 0:00:42.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.033) 0:00:42.562 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.034) 0:00:42.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.030) 0:00:42.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.029) 0:00:42.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.028) 0:00:42.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.029) 0:00:42.715 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=1851109a-281e-4940-b710-33661f63dc74 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.043) 0:00:42.758 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.031) 0:00:42.790 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.035) 0:00:42.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.027) 0:00:42.854 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.030) 0:00:42.885 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.035) 0:00:42.920 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.035) 0:00:42.956 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100460.2091215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100460.2091215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100460.2091215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.397) 0:00:43.354 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:21:06 
+0000 (0:00:00.037) 0:00:43.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.034) 0:00:43.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:21:06 +0000 (0:00:00.032) 0:00:43.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.488 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.033) 0:00:43.521 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:43.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:43.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.032) 0:00:43.614 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.036) 0:00:43.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.028) 0:00:43.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:43.799 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.035) 0:00:43.834 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.033) 0:00:43.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.897 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.033) 0:00:43.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:43.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.032) 0:00:43.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.028) 0:00:44.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:44.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:44.143 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.033) 0:00:44.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:44.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.295 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.031) 0:00:44.327 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.037) 0:00:44.365 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.031) 0:00:44.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.030) 0:00:44.427 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:21:07 +0000 (0:00:00.029) 0:00:44.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.030) 0:00:44.487 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.032) 0:00:44.520 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.068) 0:00:44.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.030) 0:00:44.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.028) 0:00:44.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.028) 0:00:44.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.029) 0:00:44.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.029) 0:00:44.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.030) 0:00:44.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.028) 0:00:44.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.028) 0:00:44.821 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.028) 0:00:44.850 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:51
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.029) 0:00:44.880 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.068) 0:00:44.948 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:21:08 +0000 (0:00:00.043) 0:00:44.992 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.530) 0:00:45.523 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.069) 0:00:45.593 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.031) 0:00:45.624 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.029) 0:00:45.653 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.060) 0:00:45.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.026) 0:00:45.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.029) 0:00:45.770 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.030) 0:00:45.800 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.035) 0:00:45.835 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.032) 0:00:45.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.031) 0:00:45.899 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.029) 0:00:45.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.029) 0:00:45.959 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.042) 0:00:46.001 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:21:09 +0000 (0:00:00.027) 0:00:46.028 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:21:10 +0000 (0:00:01.068) 0:00:47.097 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.030) 0:00:47.128 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.027) 0:00:47.156 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.037) 0:00:47.193 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.033) 0:00:47.227 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.035) 0:00:47.262 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:21:10 +0000 (0:00:00.026) 0:00:47.289 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:21:11 +0000 (0:00:00.655) 0:00:47.944 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=1851109a-281e-4940-b710-33661f63dc74', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:21:11 +0000 (0:00:00.394) 0:00:48.339 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:21:12 +0000 (0:00:00.664) 0:00:49.003 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:21:12 +0000 (0:00:00.376) 0:00:49.379 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:21:12 +0000 (0:00:00.030) 0:00:49.409 ********
ok: [/cache/rhel-x.qcow2]
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:62
Wednesday 01 June 2022 16:21:13 +0000 (0:00:00.857) 0:00:50.266 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:21:13 +0000 (0:00:00.060) 0:00:50.327 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:21:13 +0000 (0:00:00.029) 0:00:50.356 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:21:13 +0000 (0:00:00.037) 0:00:50.394 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1851109a-281e-4940-b710-33661f63dc74" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:21:14 +0000 (0:00:00.373) 0:00:50.767 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002526", "end": "2022-06-01 12:21:14.122816", "rc": 0, "start": "2022-06-01 12:21:14.120290" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=1851109a-281e-4940-b710-33661f63dc74 /opt/test ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:21:14 +0000 (0:00:00.374) 0:00:51.142 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002480", "end": "2022-06-01 12:21:14.487839", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:21:14.485359" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.366) 0:00:51.509 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.033) 0:00:51.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.033) 0:00:51.575 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.060) 0:00:51.636 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.036) 0:00:51.672 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.106) 0:00:51.778 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.035) 0:00:51.814 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "1851109a-281e-4940-b710-33661f63dc74" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "1851109a-281e-4940-b710-33661f63dc74" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.084) 0:00:51.898 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.036) 0:00:51.935 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.033) 0:00:51.969 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.036) 0:00:52.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.028) 0:00:52.034 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.028) 0:00:52.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.032) 0:00:52.095 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.030) 0:00:52.126 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=1851109a-281e-4940-b710-33661f63dc74 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.045) 0:00:52.171 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.033) 0:00:52.204 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.034) 0:00:52.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.029) 0:00:52.268 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.032) 0:00:52.301 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.037) 0:00:52.339 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:21:15 +0000 (0:00:00.035) 0:00:52.375 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100460.2091215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100460.2091215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100460.2091215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.375) 0:00:52.750 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.036) 0:00:52.786 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.034) 0:00:52.821 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.032) 0:00:52.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:52.883 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.034) 0:00:52.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.031) 0:00:52.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.029) 0:00:52.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.028) 0:00:53.008 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.036) 0:00:53.045 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.032) 0:00:53.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.029) 0:00:53.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:53.137 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.031) 0:00:53.169 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:53.199 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.036) 0:00:53.236 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.036) 0:00:53.273 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:53.304 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:53.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.029) 0:00:53.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.030) 0:00:53.394
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.029) 0:00:53.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.031) 0:00:53.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:21:16 +0000 (0:00:00.029) 0:00:53.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.572 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.031) 0:00:53.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.693 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.030) 0:00:53.724 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.028) 0:00:53.752 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.027) 0:00:53.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.030) 0:00:53.811 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.029) 0:00:53.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.030) 0:00:53.871 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.033) 0:00:53.904 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.031) 0:00:53.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.030) 0:00:53.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.033) 0:00:53.999 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.035) 0:00:54.035 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.032) 0:00:54.068 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.033) 0:00:54.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.039) 0:00:54.141 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.054) 0:00:54.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.053) 0:00:54.249 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.103) 0:00:54.353 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:64
Wednesday 01 June 2022 16:21:17 +0000 (0:00:00.046) 0:00:54.399 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.102) 0:00:54.502 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.053) 0:00:54.555 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.518) 0:00:55.074 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.070) 0:00:55.144 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.032) 0:00:55.177 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.033) 0:00:55.211 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.062) 0:00:55.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.023) 0:00:55.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.031) 0:00:55.328 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.034) 0:00:55.362 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test", "name": "test1", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.037) 0:00:55.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.032) 
0:00:55.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:21:18 +0000 (0:00:00.031) 0:00:55.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:21:19 +0000 (0:00:00.030) 0:00:55.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:21:19 +0000 (0:00:00.032) 0:00:55.527 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:21:19 +0000 (0:00:00.044) 0:00:55.572 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:21:19 +0000 (0:00:00.027) 0:00:55.600 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": 
[ { "fstype": "ext4", "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:21:20 +0000 (0:00:01.359) 0:00:56.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:21:20 +0000 (0:00:00.031) 0:00:56.990 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:21:20 +0000 (0:00:00.028) 0:00:57.019 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:21:20 +0000 (0:00:00.039) 0:00:57.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:21:20 +0000 (0:00:00.035) 0:00:57.093 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", 
"_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:21:20 +0000 (0:00:00.035) 0:00:57.129 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=1851109a-281e-4940-b710-33661f63dc74', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=1851109a-281e-4940-b710-33661f63dc74" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:21:21 +0000 (0:00:00.392) 0:00:57.521 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current 
mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:21:21 +0000 (0:00:00.659) 0:00:58.181 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:21:21 +0000 (0:00:00.028) 0:00:58.210 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:21:22 +0000 (0:00:00.637) 0:00:58.847 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:21:22 +0000 (0:00:00.398) 0:00:59.246 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:21:22 +0000 (0:00:00.028) 0:00:59.274 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:75 Wednesday 01 June 2022 16:21:23 +0000 (0:00:00.866) 0:01:00.141 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:21:23 +0000 (0:00:00.064) 0:01:00.205 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] 
********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:21:23 +0000 (0:00:00.028) 0:01:00.234 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=1851109a-281e-4940-b710-33661f63dc74", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.]
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:21:23 +0000 (0:00:00.036) 0:01:00.270 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:21:24 +0000 (0:00:00.386) 0:01:00.657 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002675",
"end": "2022-06-01 12:21:24.011243", "rc": 0, "start": "2022-06-01 12:21:24.008568" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:21:24 +0000 (0:00:00.372) 0:01:01.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002496", "end": "2022-06-01 12:21:24.375591", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:21:24.373095" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:21:24 +0000 (0:00:00.371) 0:01:01.400 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:21:24 +0000 (0:00:00.028) 0:01:01.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:21:24 +0000 (0:00:00.031) 0:01:01.460 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.062) 0:01:01.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.034) 0:01:01.556 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.111) 0:01:01.668 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.036) 0:01:01.704 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.045) 0:01:01.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.030) 0:01:01.781 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.036) 0:01:01.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.028) 0:01:01.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.080) 0:01:01.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.030) 
0:01:01.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.029) 0:01:01.986 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.030) 0:01:02.017 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.044) 0:01:02.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.026) 0:01:02.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.041) 0:01:02.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.029) 0:01:02.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.030) 0:01:02.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.029) 0:01:02.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:21:25 +0000 (0:00:00.025) 0:01:02.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100479.8721216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100479.8721216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, 
"isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100479.8721216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.399) 0:01:02.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.038) 0:01:02.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.026) 0:01:02.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.033) 0:01:02.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:02.772 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.025) 0:01:02.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.028) 0:01:02.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:02.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.031) 0:01:02.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.025) 0:01:02.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:02.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.030) 0:01:02.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.030) 0:01:03.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.030) 0:01:03.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.032) 0:01:03.066 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.035) 0:01:03.102 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.037) 0:01:03.139 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:03.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:03.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.029) 0:01:03.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.031) 0:01:03.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.035) 0:01:03.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:21:26 
+0000 (0:00:00.037) 0:01:03.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.032) 0:01:03.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.032) 0:01:03.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.031) 0:01:03.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:21:26 +0000 (0:00:00.033) 0:01:03.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.031) 0:01:03.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.031) 0:01:03.525 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.031) 0:01:03.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.031) 0:01:03.588 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.033) 0:01:03.621 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.036) 0:01:03.658 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.718 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 
2022 16:21:27 +0000 (0:00:00.030) 0:01:03.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.777 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.033) 0:01:03.811 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.033) 0:01:03.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.028) 0:01:03.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 
01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:03.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.032) 0:01:04.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.029) 0:01:04.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.030) 0:01:04.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.030) 0:01:04.114 ******** ok: 
[/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=260 changed=9 unreachable=0 failed=0 skipped=228 rescued=0 ignored=0

Wednesday 01 June 2022 16:21:27 +0000 (0:00:00.016)       0:01:04.130 ********
===============================================================================
linux-system-roles.storage : make sure blivet is available ------------- 14.26s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : get service facts -------------------------- 1.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.47s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.36s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.30s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:2 -----------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path =
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:21:28 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:21:29 +0000 (0:00:01.281)       0:00:01.304 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_disk_fs_nvme_generated.yml ******************************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_fs_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
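The "censored" entries in the repository-setup task above are the standard rendering of `no_log: true`. A hedged sketch of that pattern, with a hypothetical module and variables (the real task body is hidden by the censoring itself, so everything below the task name is an assumption):

```yaml
# Hypothetical sketch: no_log suppresses per-item results, so each loop item
# prints only the "censored" placeholder seen in the log above.
- name: set up internal repositories   # task name from the play above
  copy:                                # module assumed; the log hides it
    content: "{{ item.definition }}"   # assumed item structure
    dest: "/etc/yum.repos.d/{{ item.name }}.repo"
  loop: "{{ internal_repos }}"         # assumed variable
  no_log: true                         # repo URLs may embed credentials
```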
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:21:29 +0000 (0:00:00.017)       0:00:01.322 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:21:30 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:21:31 +0000 (0:00:01.239) 0:00:01.261 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.24s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_change_disk_fs_scsi_generated.yml ****************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_fs_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs_scsi_generated.yml:3 
Wednesday 01 June 2022 16:21:31 +0000 (0:00:00.015) 0:00:01.276 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs_scsi_generated.yml:7 Wednesday 01 June 2022 16:21:32 +0000 (0:00:01.084) 0:00:02.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:2 Wednesday 01 June 2022 16:21:32 +0000 (0:00:00.027) 0:00:02.389 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:11 Wednesday 01 June 2022 16:21:33 +0000 (0:00:00.802) 0:00:03.191 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:21:33 +0000 (0:00:00.039) 0:00:03.230 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:21:33 +0000 (0:00:00.158) 0:00:03.389 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.514) 0:00:03.903 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.074) 0:00:03.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.023) 0:00:04.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.022) 0:00:04.023 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed 
on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.196) 0:00:04.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:21:34 +0000 (0:00:00.020) 0:00:04.240 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:21:35 +0000 (0:00:01.079) 0:00:05.319 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:21:35 +0000 (0:00:00.044) 0:00:05.364 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:21:35 +0000 (0:00:00.047) 0:00:05.411 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:21:36 +0000 (0:00:00.667) 0:00:06.078 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:21:36 +0000 (0:00:00.080) 0:00:06.159 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:21:36 +0000 (0:00:00.019) 0:00:06.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:21:36 +0000 (0:00:00.021) 0:00:06.201 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:21:36 +0000 (0:00:00.024) 0:00:06.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:21:37 +0000 (0:00:00.810) 0:00:07.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:21:39 +0000 (0:00:01.863) 0:00:08.900 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:21:39 +0000 (0:00:00.044) 0:00:08.944 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:21:39 +0000 (0:00:00.056) 0:00:09.000 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:21:39 +0000 (0:00:00.525) 0:00:09.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:21:39 +0000 (0:00:00.030) 0:00:09.557 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.030) 0:00:09.588 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.034) 0:00:09.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.034) 0:00:09.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.033) 0:00:09.690 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.027) 0:00:09.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.026) 0:00:09.745 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.026) 0:00:09.771 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.027) 0:00:09.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.452) 0:00:10.251 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:21:40 +0000 (0:00:00.029) 0:00:10.280 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:14 Wednesday 01 June 2022 16:21:41 +0000 (0:00:00.833) 0:00:11.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:21 Wednesday 01 June 2022 16:21:41 +0000 (0:00:00.030) 0:00:11.144 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:21:41 +0000 (0:00:00.043) 0:00:11.188 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] 
******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.525) 0:00:11.713 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.036) 0:00:11.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.029) 0:00:11.779 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk device with the default file system type] ****************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:26 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.031) 0:00:11.811 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.055) 0:00:11.866 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.041) 0:00:11.908 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.512) 0:00:12.421 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.075) 0:00:12.497 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.029) 0:00:12.526 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:21:42 +0000 (0:00:00.030) 0:00:12.557 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.061) 0:00:12.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.028) 0:00:12.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.061) 0:00:12.708 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.033) 0:00:12.741 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.036) 0:00:12.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.029) 0:00:12.807 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.029) 0:00:12.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.029) 0:00:12.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.030) 0:00:12.897 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.042) 0:00:12.939 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:21:43 +0000 (0:00:00.027) 0:00:12.967 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, 
"fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=db315365-661c-471b-89a9-554f6339c83f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:21:44 +0000 (0:00:01.269) 0:00:14.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.033) 0:00:14.269 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.028) 0:00:14.298 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=db315365-661c-471b-89a9-554f6339c83f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.036) 0:00:14.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.034) 0:00:14.369 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=db315365-661c-471b-89a9-554f6339c83f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.033) 0:00:14.403 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:21:44 +0000 (0:00:00.039) 0:00:14.442 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:21:45 +0000 (0:00:00.948) 0:00:15.391 
******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=db315365-661c-471b-89a9-554f6339c83f', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:21:46 +0000 (0:00:00.569) 0:00:15.961 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:21:47 +0000 (0:00:00.677) 0:00:16.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:21:47 +0000 (0:00:00.380) 0:00:17.019 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:21:47 +0000 (0:00:00.028) 0:00:17.047 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:36 Wednesday 01 June 2022 16:21:48 +0000 (0:00:00.873) 
0:00:17.920 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:21:48 +0000 (0:00:00.054) 0:00:17.975 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:21:48 +0000 (0:00:00.030) 0:00:18.005 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=db315365-661c-471b-89a9-554f6339c83f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:21:48 +0000 (0:00:00.037) 0:00:18.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "db315365-661c-471b-89a9-554f6339c83f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:21:48 +0000 (0:00:00.518) 0:00:18.561 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.002423", "end": "2022-06-01 12:21:48.906837", "rc": 0, "start": "2022-06-01 12:21:48.904414" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=db315365-661c-471b-89a9-554f6339c83f /opt/test xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.464) 0:00:19.026 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002523", "end": "2022-06-01 12:21:49.277652", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:21:49.275129" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.367) 0:00:19.393 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.028) 0:00:19.422 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.030) 0:00:19.453 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.062) 0:00:19.515 ******** ok: [/cache/rhel-x.qcow2] => 
{ "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:21:49 +0000 (0:00:00.035) 0:00:19.551 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.121) 0:00:19.672 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.033) 0:00:19.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "db315365-661c-471b-89a9-554f6339c83f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "db315365-661c-471b-89a9-554f6339c83f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.038) 0:00:19.744 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.033) 0:00:19.778 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.032) 0:00:19.810 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.034) 0:00:19.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.029) 0:00:19.874 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.031) 0:00:19.906 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.028) 0:00:19.935 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.029) 0:00:19.964 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=db315365-661c-471b-89a9-554f6339c83f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.044) 0:00:20.009 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.033) 0:00:20.042 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.035) 0:00:20.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.031) 0:00:20.110 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.030) 0:00:20.140 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.037) 0:00:20.177 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:21:50 +0000 (0:00:00.037) 0:00:20.214 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100504.0661216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100504.0661216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100504.0661216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.377) 0:00:20.592 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.037) 0:00:20.629 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.035) 0:00:20.664 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.031) 0:00:20.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.027) 0:00:20.723 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.034) 0:00:20.758 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:20.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.028) 0:00:20.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.028) 0:00:20.845 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.037) 0:00:20.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:20.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.028) 0:00:20.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.028) 0:00:20.968 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.028) 0:00:20.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:21.026 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.039) 0:00:21.066 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.065) 0:00:21.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.193 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.032) 0:00:21.256 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.032) 0:00:21.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.033) 0:00:21.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.352 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:21.382 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.413 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:21.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.029) 0:00:21.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.032) 0:00:21.505 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.032) 0:00:21.538 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:21:51 +0000 (0:00:00.030) 0:00:21.568 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.033) 0:00:21.601 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.631 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.660 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.031) 0:00:21.691 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.750 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.032) 0:00:21.783 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.030) 0:00:21.814 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.028) 0:00:21.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.031) 0:00:21.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.932 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.028) 0:00:21.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:21.989 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.029) 0:00:22.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.030) 0:00:22.049 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.030) 0:00:22.079 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change the disk device file system type to "ext4"] ***********************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:38
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.035) 0:00:22.114 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.070) 0:00:22.185 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:21:52 +0000 (0:00:00.046) 0:00:22.232 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.524) 0:00:22.756 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.070) 0:00:22.827 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.031) 0:00:22.858 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.029) 0:00:22.888 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.058) 0:00:22.946 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.033) 0:00:22.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.032) 0:00:23.011 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.033) 0:00:23.045 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.041) 0:00:23.086 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.030) 0:00:23.117 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.029) 0:00:23.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.032) 0:00:23.179 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.029) 0:00:23.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.042) 0:00:23.251 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:21:53 +0000 (0:00:00.027) 0:00:23.279 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:21:55 +0000 (0:00:01.407) 0:00:24.687 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.030) 0:00:24.717 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.027) 0:00:24.745 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.039) 0:00:24.784 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.035) 0:00:24.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.037) 0:00:24.858 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=db315365-661c-471b-89a9-554f6339c83f', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=db315365-661c-471b-89a9-554f6339c83f" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:21:55 +0000 (0:00:00.376) 0:00:25.234 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:21:56 +0000 (0:00:00.647) 0:00:25.881 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:21:56 +0000 (0:00:00.408) 0:00:26.290 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:21:57 +0000 (0:00:00.618) 0:00:26.908 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:21:57 +0000 (0:00:00.367) 0:00:27.276 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:21:57 +0000 (0:00:00.029) 0:00:27.306 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:49
Wednesday 01 June 2022 16:21:58 +0000 (0:00:00.828) 0:00:28.134 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:21:58 +0000 (0:00:00.062) 0:00:28.196 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:21:58 +0000 (0:00:00.031) 0:00:28.228 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:21:58 +0000 (0:00:00.038) 0:00:28.266 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name":
"/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.377) 0:00:28.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002327", "end": "2022-06-01 12:21:58.886817", "rc": 0, "start": "2022-06-01 12:21:58.884490" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852 /opt/test ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.359) 0:00:29.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003098", "end": "2022-06-01 12:21:59.253142", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:21:59.250044" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.369) 0:00:29.374 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.028) 0:00:29.402 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.031) 0:00:29.434 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.060) 0:00:29.494 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:21:59 +0000 (0:00:00.033) 0:00:29.528 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.164) 0:00:29.692 ******** ok: [/cache/rhel-x.qcow2] => 
{ "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.037) 0:00:29.730 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.043) 0:00:29.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.038) 0:00:29.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.034) 0:00:29.846 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.037) 0:00:29.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.029) 0:00:29.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.032) 0:00:29.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.030) 0:00:29.976 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.030) 0:00:30.007 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.045) 0:00:30.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.034) 0:00:30.087 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.036) 0:00:30.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.033) 0:00:30.156 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.031) 0:00:30.188 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.037) 0:00:30.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:22:00 +0000 (0:00:00.036) 0:00:30.262 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100514.5161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100514.5161216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100514.5161216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.388) 0:00:30.650 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:22:01 
+0000 (0:00:00.037) 0:00:30.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.033) 0:00:30.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.032) 0:00:30.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.029) 0:00:30.784 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.034) 0:00:30.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.029) 0:00:30.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.029) 0:00:30.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.028) 0:00:30.906 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.035) 0:00:30.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.027) 0:00:30.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.029) 0:00:31.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.090 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.041) 0:00:31.132 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.034) 0:00:31.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.032) 0:00:31.198 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.031) 0:00:31.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.031) 0:00:31.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.045) 0:00:31.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.044) 0:00:31.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.032) 0:00:31.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.031) 0:00:31.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.029) 0:00:31.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.476 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.033) 0:00:31.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:22:01 +0000 (0:00:00.030) 0:00:31.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:31.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:31.631 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.032) 0:00:31.664 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:31.694 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.032) 0:00:31.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.029) 0:00:31.756 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.029) 0:00:31.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:31.817 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.035) 0:00:31.852 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.035) 0:00:31.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.037) 0:00:31.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.071) 0:00:31.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:32.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.029) 0:00:32.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.031) 0:00:32.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 
Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.035) 0:00:32.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.031) 0:00:32.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.033) 0:00:32.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.030) 0:00:32.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:51 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.031) 0:00:32.253 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:22:02 +0000 (0:00:00.068) 0:00:32.321 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:22:02 +0000 
(0:00:00.043) 0:00:32.365 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.525) 0:00:32.891 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.069) 0:00:32.961 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.031) 0:00:32.992 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.030) 0:00:33.023 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.060) 0:00:33.084 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.027) 0:00:33.112 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.030) 0:00:33.142 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.032) 0:00:33.174 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": [{"disks": ["sda"], "fs_type": "ext4", "mount_point": "/opt/test", "name": "test1", "type": "disk"}]}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.036) 0:00:33.211 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.029) 0:00:33.240 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.029) 0:00:33.269 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.031) 0:00:33.301 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.030) 0:00:33.331 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.044) 0:00:33.376 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:22:03 +0000 (0:00:00.026) 0:00:33.402 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted"}], "packages": ["e2fsprogs", "dosfstools", "xfsprogs"], "pools": [], "volumes": [{"_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null}]}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:22:04 +0000 (0:00:01.040) 0:00:34.443 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:22:04 +0000 (0:00:00.031) 0:00:34.475 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:22:04 +0000 (0:00:00.028) 0:00:34.503 ********
ok: [/cache/rhel-x.qcow2] => {"blivet_output": {"actions": [], "changed": false, "crypts": [], "failed": false, "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted"}], "packages": ["e2fsprogs", "dosfstools", "xfsprogs"], "pools": [], "volumes": [{"_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null}]}}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:22:04 +0000 (0:00:00.037) 0:00:34.541 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:22:04 +0000 (0:00:00.033) 0:00:34.574 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": [{"_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null}]}, "changed": false}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:22:05 +0000 (0:00:00.035) 0:00:34.610 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:22:05 +0000 (0:00:00.028) 0:00:34.638 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "name": null, "status": {}}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:22:05 +0000 (0:00:00.652) 0:00:35.290 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => {"ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": {"dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "mounted"}, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852"}

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:22:06 +0000 (0:00:00.392) 0:00:35.683 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "name": null, "status": {}}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:22:06 +0000 (0:00:00.633) 0:00:36.316 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:22:07 +0000 (0:00:00.359) 0:00:36.675 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:22:07 +0000 (0:00:00.030) 0:00:36.706 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:62
Wednesday 01 June 2022 16:22:07 +0000 (0:00:00.814) 0:00:37.520 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:22:07 +0000 (0:00:00.060) 0:00:37.580 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:22:08 +0000 (0:00:00.029) 0:00:37.609 ********
ok: [/cache/rhel-x.qcow2] => {"_storage_volumes_list": [{"_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null}]}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:22:08 +0000 (0:00:00.039) 0:00:37.649 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "info": {"/dev/sda": {"fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852"}, "/dev/sdb": {"fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": ""}, "/dev/sdc": {"fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": ""}, "/dev/sr0": {"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00"}, "/dev/vda": {"fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": ""}, "/dev/vda1": {"fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": ""}, "/dev/vda2": {"fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7"}, "/dev/vda3": {"fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7"}, "/dev/vda4": {"fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345"}, "/dev/vdb": {"fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": ""}, "/dev/vdc": {"fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": ""}, "/dev/vdd": {"fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": ""}}}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:22:08 +0000 (0:00:00.383) 0:00:38.032 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002672", "end": "2022-06-01 12:22:08.273030", "rc": 0, "start": "2022-06-01 12:22:08.270358"}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852 /opt/test ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:22:08 +0000 (0:00:00.358) 0:00:38.391 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.002639", "end": "2022-06-01 12:22:08.653244", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:22:08.650605"}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.381) 0:00:38.772 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.028) 0:00:38.800 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.031) 0:00:38.832 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.058) 0:00:38.891 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.033) 0:00:38.924 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.109) 0:00:39.034 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/sda"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.034) 0:00:39.068 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "a0e87a9c-269b-4a0e-b8b0-915d20d10852"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.042) 0:00:39.110 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.091) 0:00:39.202 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.037) 0:00:39.239 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.038) 0:00:39.277 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.029) 0:00:39.307 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.030) 0:00:39.338 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.030) 0:00:39.368 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.033) 0:00:39.401 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852 "], "storage_test_fstab_mount_options_matches": [" /opt/test ext4 defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.047) 0:00:39.449 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.035) 0:00:39.485 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.037) 0:00:39.522 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:22:09 +0000 (0:00:00.031) 0:00:39.554 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.032) 0:00:39.586 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.040) 0:00:39.626 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.036) 0:00:39.662 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654100514.5161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100514.5161216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100514.5161216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.389) 0:00:40.052 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.037) 0:00:40.089 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.035) 0:00:40.125 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "disk"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.034) 0:00:40.160 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.030) 0:00:40.190 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.038) 0:00:40.229 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.031) 0:00:40.260 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.031) 0:00:40.291 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.029) 0:00:40.320 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.036) 0:00:40.356 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.030) 0:00:40.387 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.032) 0:00:40.419 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.030) 0:00:40.450 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.029) 0:00:40.480 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.029) 0:00:40.510 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:22:10 +0000 (0:00:00.037) 0:00:40.547 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.035) 0:00:40.583 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:40.614 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.029) 0:00:40.644 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.029) 0:00:40.673 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.030) 0:00:40.704 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.027) 0:00:40.732 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.027) 0:00:40.759 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.030) 0:00:40.789 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:40.820 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.028) 0:00:40.849 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.027) 0:00:40.876 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason":
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.030) 0:00:40.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.030) 0:00:40.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.032) 0:00:40.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.029) 0:00:41.000 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.032) 0:00:41.033 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.032) 0:00:41.065 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:41.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:41.128 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.033) 0:00:41.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:41.193 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.034) 0:00:41.228 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.034) 0:00:41.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.030) 0:00:41.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.035) 0:00:41.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.039) 0:00:41.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.031) 0:00:41.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.032) 0:00:41.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.033) 0:00:41.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.029) 
0:00:41.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.029) 0:00:41.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:22:11 +0000 (0:00:00.033) 0:00:41.557 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:64 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.071) 0:00:41.628 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.077) 0:00:41.706 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.046) 0:00:41.752 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.519) 0:00:42.271 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.069) 0:00:42.341 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.029) 0:00:42.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.033) 0:00:42.403 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.066) 0:00:42.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.028) 0:00:42.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.032) 0:00:42.531 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:22:12 +0000 (0:00:00.033) 0:00:42.564 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test", "name": "test1", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.037) 0:00:42.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.037) 
0:00:42.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.033) 0:00:42.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.030) 0:00:42.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.036) 0:00:42.739 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.044) 0:00:42.783 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:22:13 +0000 (0:00:00.028) 0:00:42.811 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": 
[ { "fstype": "ext4", "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:22:14 +0000 (0:00:01.275) 0:00:44.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:22:14 +0000 (0:00:00.032) 0:00:44.119 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:22:14 +0000 (0:00:00.028) 0:00:44.147 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:22:14 +0000 (0:00:00.037) 0:00:44.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:22:14 +0000 (0:00:00.034) 0:00:44.220 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", 
"_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:22:14 +0000 (0:00:00.038) 0:00:44.258 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:22:15 +0000 (0:00:00.386) 0:00:44.645 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current 
mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:22:15 +0000 (0:00:00.653) 0:00:45.299 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:22:15 +0000 (0:00:00.030) 0:00:45.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:22:16 +0000 (0:00:00.639) 0:00:45.969 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:22:16 +0000 (0:00:00.387) 0:00:46.357 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:22:16 +0000 (0:00:00.029) 0:00:46.387 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:75 Wednesday 01 June 2022 16:22:17 +0000 (0:00:00.817) 0:00:47.204 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:22:17 +0000 (0:00:00.063) 0:00:47.267 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] 
******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:22:17 +0000 (0:00:00.031) 0:00:47.299 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=a0e87a9c-269b-4a0e-b8b0-915d20d10852", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:22:17 +0000 (0:00:00.036) 0:00:47.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:22:18 +0000 (0:00:00.388) 0:00:47.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003210", 
"end": "2022-06-01 12:22:17.986441", "rc": 0, "start": "2022-06-01 12:22:17.983231" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:22:18 +0000 (0:00:00.386) 0:00:48.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002518", "end": "2022-06-01 12:22:18.372755", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:22:18.370237" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:22:18 +0000 (0:00:00.380) 0:00:48.490 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:22:18 +0000 (0:00:00.028) 0:00:48.519 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:22:18 +0000 (0:00:00.030) 0:00:48.550 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.058) 0:00:48.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.035) 0:00:48.644 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.113) 0:00:48.757 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.066) 0:00:48.824 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.039) 0:00:48.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.029) 0:00:48.893 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.033) 0:00:48.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.029) 0:00:48.955 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.029) 0:00:48.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.031) 0:00:49.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.029) 0:00:49.045 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.031) 0:00:49.076 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.045) 0:00:49.122 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.025) 0:00:49.148 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.035) 0:00:49.184 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.032) 0:00:49.216 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.031) 0:00:49.247 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.030) 0:00:49.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:22:19 +0000 (0:00:00.024) 0:00:49.302 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100533.9001215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100533.9001215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100533.9001215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.394) 0:00:49.697 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.037) 0:00:49.734 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.026) 0:00:49.760 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.034) 0:00:49.794 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.029) 0:00:49.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.025) 0:00:49.850 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.029) 0:00:49.879 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.033) 0:00:49.913 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.033) 0:00:49.946 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.029) 0:00:49.976 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.037 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.068 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.130 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.041) 0:00:50.171 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.036) 0:00:50.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.240 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.270 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.030) 0:00:50.301 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.032) 0:00:50.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.034) 0:00:50.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.032) 0:00:50.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.497 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.031) 0:00:50.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:22:20 +0000 (0:00:00.034) 0:00:50.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.033) 0:00:50.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.033) 0:00:50.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:50.663 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.035) 0:00:50.698 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:50.731 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.033) 0:00:50.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.031) 0:00:50.796 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:50.828 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.031) 0:00:50.860 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.033) 0:00:50.894 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:50.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:50.958 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.029) 0:00:50.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.030) 0:00:51.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.030) 0:00:51.048 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.031) 0:00:51.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.033) 0:00:51.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.032) 0:00:51.145 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.069) 0:00:51.215 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.031) 0:00:51.246 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=262 changed=7 unreachable=0 failed=0 skipped=228 rescued=0 ignored=0

Wednesday 01 June 2022 16:22:21 +0000 (0:00:00.018) 0:00:51.265 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.41s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.28s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp7247_7fr/tests/tests_change_disk_fs_scsi_generated.yml:3 --------------
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml:2 -----------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : get required packages ---------------------- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:22:22 +0000 (0:00:00.021) 0:00:00.021 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:22:23 +0000 (0:00:01.319) 0:00:01.341 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_disk_mount.yml ******************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:2
Wednesday 01 June 2022 16:22:23 +0000 (0:00:00.013) 0:00:01.354 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:10
Wednesday 01 June 2022 16:22:24 +0000 (0:00:01.088) 0:00:02.442 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:22:24 +0000 (0:00:00.039) 0:00:02.481 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.157) 0:00:02.639 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.536) 0:00:03.176 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.078) 0:00:03.254 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.024) 0:00:03.278 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.022) 0:00:03.301 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.188) 0:00:03.490 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:22:25 +0000 (0:00:00.018) 0:00:03.508 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:22:26 +0000 (0:00:01.090) 0:00:04.598 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.047) 0:00:04.646 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.044) 0:00:04.690 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.682) 0:00:05.373 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.078) 0:00:05.452 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.021) 0:00:05.474 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.023) 0:00:05.497 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:22:27 +0000 (0:00:00.022) 0:00:05.520 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:22:28 +0000 (0:00:00.799) 0:00:06.320 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name":
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:22:30 +0000 (0:00:01.848) 0:00:08.169 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:22:30 +0000 (0:00:00.043) 0:00:08.213 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:22:30 +0000 (0:00:00.027) 0:00:08.240 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.557) 0:00:08.798 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.030) 0:00:08.828 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.027) 0:00:08.856 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.033) 0:00:08.889 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.031) 0:00:08.921 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.032) 0:00:08.953 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.060) 0:00:09.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.029) 0:00:09.043 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.028) 0:00:09.072 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.029) 0:00:09.101 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.474) 0:00:09.576 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:22:31 +0000 (0:00:00.028) 0:00:09.604 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:13
Wednesday 01 June 2022 16:22:32 +0000 (0:00:00.840) 0:00:10.445 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:20
Wednesday 01 June 2022 16:22:32 +0000 (0:00:00.032) 0:00:10.477 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:22:32 +0000 (0:00:00.045) 0:00:10.523 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.504) 0:00:11.028 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.033) 0:00:11.061 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.029) 0:00:11.090 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create a disk device mounted at "/opt/test1"] ****************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:25
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.031) 0:00:11.122 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.051) 0:00:11.173 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:22:33 +0000 (0:00:00.039) 0:00:11.213 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.511) 0:00:11.724 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.067) 0:00:11.792 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.029) 0:00:11.821 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.029) 0:00:11.850 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.059) 0:00:11.910 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.025) 0:00:11.935 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.027) 0:00:11.963 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.029) 0:00:11.992 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.034) 0:00:12.027 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.030) 0:00:12.058 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.027) 0:00:12.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:22:34 +0000
(0:00:00.027) 0:00:12.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.028) 0:00:12.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.042) 0:00:12.184 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:22:34 +0000 (0:00:00.026) 0:00:12.211 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:22:35 +0000 (0:00:01.280) 0:00:13.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:22:35 +0000 (0:00:00.030) 0:00:13.522 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:22:35 +0000 (0:00:00.057) 0:00:13.579 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": 
"UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:22:36 +0000 (0:00:00.037) 0:00:13.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:22:36 +0000 (0:00:00.033) 0:00:13.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:22:36 +0000 (0:00:00.034) 0:00:13.684 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:22:36 +0000 (0:00:00.027) 0:00:13.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:22:36 +0000 (0:00:00.894) 0:00:14.607 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:22:37 +0000 (0:00:00.510) 0:00:15.118 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:22:38 +0000 (0:00:00.654) 0:00:15.772 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:22:38 +0000 (0:00:00.371) 0:00:16.144 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:22:38 +0000 (0:00:00.028) 0:00:16.173 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:35 Wednesday 01 June 2022 16:22:39 +0000 (0:00:00.827) 0:00:17.001 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:22:39 +0000 (0:00:00.052) 0:00:17.053 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:22:39 +0000 (0:00:00.029) 0:00:17.082 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": 
"/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:22:39 +0000 (0:00:00.037) 0:00:17.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" 
}, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:22:40 +0000 (0:00:00.510) 0:00:17.630 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002496", "end": "2022-06-01 12:22:39.979003", "rc": 0, "start": "2022-06-01 12:22:39.976507" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:22:40 +0000 (0:00:00.495) 0:00:18.125 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002451", "end": "2022-06-01 12:22:40.348202", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:22:40.345751" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 
Wednesday 01 June 2022 16:22:40 +0000 (0:00:00.367) 0:00:18.492 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:22:40 +0000 (0:00:00.028) 0:00:18.521 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:22:40 +0000 (0:00:00.030) 0:00:18.551 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.058) 0:00:18.609 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.033) 0:00:18.642 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for 
/cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.143) 0:00:18.786 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.032) 0:00:18.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.041) 0:00:18.860 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] 
*************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.035) 0:00:18.895 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.032) 0:00:18.928 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.035) 0:00:18.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.027) 0:00:18.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.027) 0:00:19.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.030) 0:00:19.049 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, 
"storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.029) 0:00:19.078 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.044) 0:00:19.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.033) 0:00:19.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.034) 0:00:19.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.028) 0:00:19.219 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.031) 0:00:19.250 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.039) 0:00:19.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:22:41 +0000 (0:00:00.041) 0:00:19.332 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100555.2921214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100555.2921214, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100555.2921214, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.392) 0:00:19.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.034) 0:00:19.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.035) 0:00:19.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.034) 0:00:19.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.032) 0:00:19.862 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.032) 0:00:19.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.028) 0:00:19.924 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.029) 0:00:19.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.030) 0:00:19.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.034) 0:00:20.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.029) 0:00:20.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.027) 0:00:20.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.030) 0:00:20.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.033) 0:00:20.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.034) 0:00:20.172 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.036) 0:00:20.209 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.036) 0:00:20.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.030) 0:00:20.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.030) 0:00:20.306 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.028) 0:00:20.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.029) 0:00:20.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.027) 0:00:20.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.031) 0:00:20.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.030) 0:00:20.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:22:42 +0000 
(0:00:00.033) 0:00:20.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.034) 0:00:20.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.031) 0:00:20.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:22:42 +0000 (0:00:00.028) 0:00:20.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.031) 0:00:20.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.031) 0:00:20.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.030) 0:00:20.676 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.032) 0:00:20.709 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:20.738 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:20.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.032) 0:00:20.800 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:20.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:20.859 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.032) 0:00:20.891 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.030) 0:00:20.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.027) 0:00:20.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.071) 0:00:21.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:21.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:21.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.031) 0:00:21.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:21.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.028) 0:00:21.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.032) 0:00:21.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.029) 0:00:21.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the disk device mount location to "/opt/test2"] ******************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:37 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.030) 0:00:21.262 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 
01 June 2022 16:22:43 +0000 (0:00:00.059) 0:00:21.321 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:22:43 +0000 (0:00:00.044) 0:00:21.366 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.497) 0:00:21.864 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.071) 0:00:21.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.031) 0:00:21.967 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.030) 0:00:21.997 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.059) 0:00:22.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.025) 0:00:22.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.034) 0:00:22.117 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.038) 0:00:22.156 
******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test2", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.036) 0:00:22.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.029) 0:00:22.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.029) 0:00:22.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.028) 0:00:22.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.029) 0:00:22.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.047) 0:00:22.356 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:22:44 +0000 (0:00:00.027) 0:00:22.384 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:22:45 +0000 (0:00:01.048) 0:00:23.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:22:45 +0000 (0:00:00.031) 0:00:23.464 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:22:45 +0000 (0:00:00.029) 0:00:23.493 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", 
"name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:22:45 +0000 (0:00:00.037) 0:00:23.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:22:45 +0000 (0:00:00.034) 0:00:23.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:22:45 +0000 (0:00:00.035) 0:00:23.601 ******** changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:22:46 +0000 (0:00:00.387) 0:00:23.989 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:22:47 +0000 (0:00:00.663) 0:00:24.652 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:22:47 +0000 (0:00:00.412) 0:00:25.065 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : 
retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:22:48 +0000 (0:00:00.669) 0:00:25.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:22:48 +0000 (0:00:00.362) 0:00:26.097 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:22:48 +0000 (0:00:00.029) 0:00:26.126 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:47 Wednesday 01 June 2022 16:22:49 +0000 (0:00:00.864) 0:00:26.991 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:22:49 +0000 (0:00:00.055) 0:00:27.047 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:22:49 +0000 (0:00:00.027) 0:00:27.074 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:22:49 +0000 (0:00:00.036) 0:00:27.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:22:49 +0000 (0:00:00.368) 0:00:27.479 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002932", "end": "2022-06-01 12:22:49.710280", "rc": 0, "start": "2022-06-01 12:22:49.707348" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:22:50 +0000 (0:00:00.388) 0:00:27.867 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002752", "end": "2022-06-01 12:22:50.093429", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:22:50.090677" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:22:50 +0000 (0:00:00.372) 0:00:28.240 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:22:50 +0000 (0:00:00.027) 0:00:28.268 
********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.032)       0:00:28.300 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.060)       0:00:28.361 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.034)       0:00:28.395 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.134)       0:00:28.530 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.033)       0:00:28.564 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:22:50 +0000 (0:00:00.040)       0:00:28.604 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.038)       0:00:28.643 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.034)       0:00:28.677 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.036)       0:00:28.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.031)       0:00:28.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.029)       0:00:28.774 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:22:51 +0000 (0:00:00.029)       0:00:28.803 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
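For context on what the mount-verification tasks above are asserting: the test compares the expected device and mount point against the host's gathered mount facts. A minimal sketch of such an assertion is shown below; the task name and filter chain are illustrative, not the actual contents of test-verify-volume-mount.yml, and `storage_test_device_path` / the `/opt/test2` mount point are taken from the log output.

```yaml
# Sketch only: assumes ansible_mounts has been gathered (setup module) and
# storage_test_device_path is set as in the log above.
- name: Verify the current mount state by device (illustrative sketch)
  assert:
    that:
      - ansible_mounts
        | selectattr('device', 'equalto', storage_test_device_path)
        | selectattr('mount', 'equalto', '/opt/test2')
        | list | length == 1
    msg: "Expected {{ storage_test_device_path }} to be mounted at /opt/test2"
```

When the assertion holds, the task reports "All assertions passed", as seen repeatedly in the log.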
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.029) 0:00:28.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.043) 0:00:28.876 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.031) 0:00:28.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.035) 0:00:28.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.029) 0:00:28.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.030) 0:00:29.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.034) 0:00:29.037 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.034) 0:00:29.072 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100555.2921214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100555.2921214, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100555.2921214, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 
16:22:51 +0000 (0:00:00.394) 0:00:29.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.035) 0:00:29.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.035) 0:00:29.538 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:22:51 +0000 (0:00:00.039) 0:00:29.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.032) 0:00:29.609 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.036) 0:00:29.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:29.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.036) 0:00:29.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:29.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.031) 0:00:29.922 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.036) 0:00:29.959 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.034) 0:00:29.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:30.024 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:30.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:30.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.033) 0:00:30.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.035) 0:00:30.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.033) 0:00:30.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.032) 0:00:30.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.032) 0:00:30.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.031) 0:00:30.283 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.034) 0:00:30.317 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:30.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:30.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:30.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.029) 0:00:30.436 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is 
undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.032) 0:00:30.469 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.037) 0:00:30.506 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.031) 0:00:30.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.031) 0:00:30.569 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:22:52 +0000 (0:00:00.030) 0:00:30.600 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.030) 0:00:30.631 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.034) 
0:00:30.665 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.039) 0:00:30.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.030) 0:00:30.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.853 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.030) 0:00:30.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.028) 0:00:30.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:30.971 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:49 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.029) 0:00:31.000 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.100) 0:00:31.101 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:22:53 +0000 (0:00:00.042) 0:00:31.143 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.505) 0:00:31.649 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.070) 0:00:31.720 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.030) 0:00:31.751 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.029) 0:00:31.780 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.057) 0:00:31.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.025) 0:00:31.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.029) 0:00:31.893 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.032) 0:00:31.926 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ 
"sda" ], "mount_point": "/opt/test2", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.035) 0:00:31.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.029) 0:00:31.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.028) 0:00:32.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.030) 0:00:32.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.027) 0:00:32.078 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 
Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.041) 0:00:32.120 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:22:54 +0000 (0:00:00.026) 0:00:32.146 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:22:55 +0000 (0:00:01.082) 0:00:33.228 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.030) 0:00:33.259 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.027) 0:00:33.286 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", 
"vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.036) 0:00:33.323 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.032) 0:00:33.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.034) 0:00:33.390 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:22:55 +0000 (0:00:00.031) 0:00:33.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:22:56 +0000 (0:00:00.665) 0:00:34.087 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:22:56 +0000 (0:00:00.405) 0:00:34.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:22:57 +0000 (0:00:00.631) 0:00:35.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:22:57 +0000 (0:00:00.391) 0:00:35.515 ******** 
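The mount setup and systemd refresh logged above can be sketched as two hand-written tasks. This is an illustration, not the role's literal task code from main-blivet.yml; the field names mirror the logged module output (src, path, fstype, opts, state):

```yaml
# Sketch only: ensure the test volume's fstab entry exists and is
# mounted, then have systemd re-read /etc/fstab.
- name: Set up new/current mounts
  mount:
    src: "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e"
    path: /opt/test2
    fstype: xfs
    opts: defaults
    state: mounted

- name: Tell systemd to refresh its view of /etc/fstab
  systemd:
    daemon_reload: yes
```

The empty `"name": null, "status": {}` results in the log are consistent with a bare `daemon_reload` call, which reloads unit definitions (including fstab-generated mount units) without acting on any named service.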
TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:22:57 +0000 (0:00:00.028) 0:00:35.543 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:59 Wednesday 01 June 2022 16:22:58 +0000 (0:00:00.855) 0:00:36.399 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:22:58 +0000 (0:00:00.063) 0:00:36.462 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:22:58 +0000 (0:00:00.034) 0:00:36.496 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", 
"vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:22:58 +0000 (0:00:00.040) 0:00:36.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:22:59 +0000 (0:00:00.383) 0:00:36.920 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002548", "end": "2022-06-01 12:22:59.156198", "rc": 0, "start": "2022-06-01 12:22:59.153650" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e /opt/test2 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:22:59 +0000 (0:00:00.380) 0:00:37.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002537", "end": "2022-06-01 12:22:59.525486", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:22:59.522949" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.376) 0:00:37.677 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.028) 0:00:37.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.030) 0:00:37.737 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
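The two `cat` reads above can be reproduced as standalone tasks. A sketch, not the verify-role-results.yml source: register names follow the variables cleaned up later in this run, and `failed_when: false` mirrors the tolerated rc=1 (`failed_when_result: false`) for the missing /etc/crypttab:

```yaml
# Sketch only: read /etc/fstab, and read /etc/crypttab while
# tolerating its absence (cat exits 1 when the file does not exist).
- name: Read the /etc/fstab file for volume existence
  command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false

- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: storage_test_crypttab
  failed_when: false
  changed_when: false
```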
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.060) 0:00:37.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.033) 0:00:37.831 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.108) 0:00:37.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.034) 0:00:37.974 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.044) 0:00:38.018 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.034) 0:00:38.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.033) 0:00:38.087 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.036) 0:00:38.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.029) 0:00:38.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.030) 0:00:38.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.031) 0:00:38.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.031) 0:00:38.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.049) 0:00:38.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.033) 0:00:38.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.036) 0:00:38.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.029) 0:00:38.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.032) 0:00:38.429 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.035) 0:00:38.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:23:00 +0000 (0:00:00.035) 0:00:38.500 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100555.2921214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100555.2921214, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100555.2921214, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.379) 0:00:38.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.034) 0:00:38.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.035) 0:00:38.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.032) 0:00:38.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.029) 0:00:39.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.035) 0:00:39.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.030) 0:00:39.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.029) 0:00:39.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 
June 2022 16:23:01 +0000 (0:00:00.028) 0:00:39.135 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.036) 0:00:39.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.029) 0:00:39.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.072) 0:00:39.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.031) 0:00:39.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.030) 0:00:39.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.029) 0:00:39.365 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.036) 0:00:39.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.033) 0:00:39.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.031) 0:00:39.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.028) 0:00:39.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.028) 0:00:39.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.031) 0:00:39.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:23:01 +0000 (0:00:00.031) 0:00:39.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 0:00:39.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.032) 0:00:39.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.030) 0:00:39.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.030) 0:00:39.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.028) 0:00:39.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 0:00:39.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 0:00:39.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.031) 0:00:39.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.031) 0:00:39.860 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.031) 0:00:39.892 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 
0:00:39.921 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 0:00:39.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.029) 0:00:39.980 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.032) 0:00:40.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.030) 0:00:40.043 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.031) 0:00:40.075 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:23:02 +0000 (0:00:00.031) 0:00:40.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
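Every task in test-verify-volume-size.yml is skipped above because this volume is a bare disk: size parsing and the expected-size calculation only apply where the role computed a size (e.g. LVM volumes, possibly sized by a percentage of the pool). A sketch of the kind of guard involved — the exact conditional in the test file is an assumption, and `lsblk` stands in for whatever size lookup the test actually performs:

```yaml
# Sketch only: gate size parsing on the volume type, so plain "disk"
# volumes fall through with "Conditional result was False".
- name: parse the actual size of the volume
  command: lsblk -bno SIZE {{ storage_test_volume._device }}
  register: storage_test_actual_size
  changed_when: false
  when:
    - _storage_test_volume_present | bool
    - storage_test_volume.type == "lvm"
```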
TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.030)       0:00:40.137 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.033)       0:00:40.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.035)       0:00:40.206 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.031)       0:00:40.238 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.030)       0:00:40.268 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.030)       0:00:40.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.030)       0:00:40.329 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.029)       0:00:40.359 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.032)       0:00:40.391 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:61
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.029)       0:00:40.421 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.085)       0:00:40.506 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:23:02 +0000 (0:00:00.050)       0:00:40.557 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.524)       0:00:41.081 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.069)       0:00:41.151 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.031)       0:00:41.182 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.030)       0:00:41.212 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.057)       0:00:41.270 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.025)       0:00:41.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.029)       0:00:41.325 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.032)       0:00:41.358 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test2", "name": "test1", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.034)       0:00:41.393 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.029)       0:00:41.422 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.029)       0:00:41.452 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.029)       0:00:41.482 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.029)       0:00:41.511 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:23:03 +0000 (0:00:00.046)       0:00:41.558 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:23:04 +0000 (0:00:00.065)       0:00:41.624 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022  16:23:05 +0000 (0:00:01.351)       0:00:42.976 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.029)       0:00:43.005 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.026)       0:00:43.032 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.034)       0:00:43.067 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.032)       0:00:43.100 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.036)       0:00:43.136 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022  16:23:05 +0000 (0:00:00.383)       0:00:43.520 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022  16:23:06 +0000 (0:00:00.656)       0:00:44.177 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022  16:23:06 +0000 (0:00:00.032)       0:00:44.209 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022  16:23:07 +0000 (0:00:00.644)       0:00:44.853 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022  16:23:07 +0000 (0:00:00.371)       0:00:45.225 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022  16:23:07 +0000 (0:00:00.033)       0:00:45.258 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:72
Wednesday 01 June 2022  16:23:08 +0000 (0:00:00.836)       0:00:46.095 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022  16:23:08 +0000 (0:00:00.061)       0:00:46.156 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022  16:23:08 +0000 (0:00:00.030)       0:00:46.187 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=5aa4ed31-b5cc-4ffb-ba07-7f02795fe95e", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022  16:23:08 +0000 (0:00:00.037)       0:00:46.224 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022  16:23:08 +0000 (0:00:00.377)       0:00:46.602 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002530", "end": "2022-06-01 12:23:08.827259", "rc": 0, "start": "2022-06-01 12:23:08.824729" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.372)       0:00:46.975 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003026", "end": "2022-06-01 12:23:09.198254", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:23:09.195228" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.073)       0:00:47.364 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.030)       0:00:47.438 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.030)       0:00:47.469 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
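The warning above comes from the task in verify-role-results.yml that includes test-verify-volume.yml once per volume while reusing the `storage_test_volume` name. A minimal sketch of the fix Ansible suggests, assuming the looping task has roughly this shape (the renamed variable and surrounding task text are illustrative, not taken from the actual test source):

```yaml
# Illustrative: rename the loop variable via loop_control so it no longer
# collides with the 'storage_test_volume' fact set elsewhere in the tests.
- name: Verify the volumes with no pool were correctly managed
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_volumes_list }}"
  loop_control:
    loop_var: storage_test_volume_item   # hypothetical name; any unused name works
```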
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.061)       0:00:47.531 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:23:09 +0000 (0:00:00.035)       0:00:47.566 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.109)       0:00:47.676 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.035)       0:00:47.711 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.044)       0:00:47.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.031)       0:00:47.788 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.035)       0:00:47.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.029)       0:00:47.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.029)       0:00:47.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.029)       0:00:47.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.027)       0:00:47.939 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.030)       0:00:47.970 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.043)       0:00:48.013 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.024)       0:00:48.037 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.033)       0:00:48.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.028)       0:00:48.099 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.030)       0:00:48.129 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.030)       0:00:48.160 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.024)       0:00:48.185 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100584.7571216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100584.7571216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100584.7571216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:23:10 +0000 (0:00:00.402)       0:00:48.588 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.036)       0:00:48.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.026)       0:00:48.651 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.032)       0:00:48.684 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.029)       0:00:48.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.027)       0:00:48.741 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.029)       0:00:48.770 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:48.799 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:48.828 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.026)       0:00:48.854 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:48.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.029)       0:00:48.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:48.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.027)       0:00:48.968 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.027)       0:00:48.995 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.033)       0:00:49.029 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.033)       0:00:49.062 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.031)       0:00:49.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.029)       0:00:49.123 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.029)       0:00:49.153 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:49.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:23:11 +0000 (0:00:00.028)       0:00:49.211 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:23:11
+0000 (0:00:00.030) 0:00:49.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.033) 0:00:49.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.030) 0:00:49.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.030) 0:00:49.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.029) 0:00:49.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.028) 0:00:49.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.028) 0:00:49.423 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.031) 0:00:49.454 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.030) 0:00:49.484 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.032) 0:00:49.517 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.032) 0:00:49.550 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:23:11 +0000 (0:00:00.031) 0:00:49.581 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.030) 0:00:49.612 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.064) 0:00:49.677 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.030) 0:00:49.707 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.033) 0:00:49.740 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.031) 0:00:49.772 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.029) 0:00:49.801 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.029) 0:00:49.830 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.032) 0:00:49.862 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.030) 0:00:49.892 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.028) 0:00:49.921 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.028) 0:00:49.950 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.027) 0:00:49.977 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.028) 0:00:50.006 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.030) 0:00:50.037 ********
ok:
[/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=260  changed=6  unreachable=0  failed=0  skipped=228  rescued=0  ignored=0

Wednesday 01 June 2022 16:23:12 +0000 (0:00:00.014) 0:00:50.052 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.35s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.28s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:2 --------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path =
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:23:13 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1  changed=0  unreachable=0  failed=0  skipped=0  rescued=0  ignored=0

Wednesday 01 June 2022 16:23:14 +0000 (0:00:01.263) 0:00:01.287 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_disk_mount_nvme_generated.yml ***************************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_mount_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1  changed=0  unreachable=0  failed=0  skipped=0  rescued=0  ignored=0

Wednesday 01 June 2022 16:23:14 +0000 (0:00:00.018) 0:00:01.305 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:23:15 +0000 (0:00:00.021) 0:00:00.021 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1  changed=0  unreachable=0  failed=0  skipped=0  rescued=0  ignored=0

Wednesday 01 June 2022 16:23:16 +0000 (0:00:01.284) 0:00:01.306 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_disk_mount_scsi_generated.yml ***************************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_disk_mount_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path:
/tmp/tmp7247_7fr/tests/tests_change_disk_mount_scsi_generated.yml:3
Wednesday 01 June 2022 16:23:16 +0000 (0:00:00.015) 0:00:01.322 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount_scsi_generated.yml:7
Wednesday 01 June 2022 16:23:17 +0000 (0:00:01.103) 0:00:02.426 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_use_interface": "scsi"}, "changed": false}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:2
Wednesday 01 June 2022 16:23:17 +0000 (0:00:00.026) 0:00:02.453 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:10
Wednesday 01 June 2022 16:23:18 +0000 (0:00:00.834) 0:00:03.287 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:23:18 +0000 (0:00:00.065) 0:00:03.353 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:23:18 +0000 (0:00:00.223) 0:00:03.576 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.572) 0:00:04.148 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.077) 0:00:04.225 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.023) 0:00:04.249 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.023) 0:00:04.273 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.188) 0:00:04.461 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:23:19 +0000 (0:00:00.018) 0:00:04.479 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:23:20 +0000 (0:00:01.060) 0:00:05.540 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:23:20 +0000 (0:00:00.045) 0:00:05.585 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:23:20 +0000 (0:00:00.046) 0:00:05.632 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:23:21 +0000 (0:00:00.657) 0:00:06.290 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:23:21 +0000 (0:00:00.083) 0:00:06.373 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:23:21 +0000 (0:00:00.023) 0:00:06.397 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:23:21 +0000 (0:00:00.024) 0:00:06.421 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:23:21 +0000 (0:00:00.020) 0:00:06.442 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:23:22 +0000 (0:00:00.771) 0:00:07.213 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": {
    "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
    "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias"},
    "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
    "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-final.service": {"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cockpit-motd.service": {"name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static"},
    "cockpit-wsinstance-http.service": {"name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static"},
    "cockpit-wsinstance-https-factory@.service": {"name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "cockpit-wsinstance-https@.service": {"name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "cockpit.service": {"name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static"},
    "console-getty.service": {"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "container-getty@.service": {"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "cpupower.service": {"name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "crond.service": {"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled"},
    "dbus-broker.service": {"name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled"},
    "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias"},
    "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias"},
    "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias"},
    "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias"},
    "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias"},
    "dbus.service": {"name": "dbus.service", "source": "systemd", "state": "active", "status": "alias"},
    "debug-shell.service": {"name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "dm-event.service": {"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dnf-makecache.service": {"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-cmdline.service": {"name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-initqueue.service": {"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-mount.service": {"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-shutdown.service": {"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static"},
    "emergency.service": {"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static"},
    "fstrim.service": {"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static"},
    "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static"},
    "fwupd-refresh.service": {"name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static"},
    "fwupd.service": {"name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static"},
    "getty@.service": {"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
    "getty@tty1.service": {"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown"},
    "grub-boot-indeterminate.service":
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:23:24 +0000 (0:00:01.816) 0:00:09.030 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.044) 0:00:09.074 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.028) 0:00:09.103 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.515) 0:00:09.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.030) 0:00:09.649 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.027) 0:00:09.676 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.033) 0:00:09.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.031) 0:00:09.741 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.033) 0:00:09.775 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:23:24 +0000 (0:00:00.027) 0:00:09.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:23:25 +0000 (0:00:00.027) 0:00:09.830 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 16:23:25 +0000 (0:00:00.026) 0:00:09.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:23:25 +0000 (0:00:00.027) 0:00:09.884 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:23:25 +0000 (0:00:00.489) 0:00:10.374 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:23:25 +0000 (0:00:00.029) 0:00:10.403 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:13 Wednesday 01 June 2022 16:23:26 +0000 (0:00:00.870) 0:00:11.274 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:20 Wednesday 01 June 2022 16:23:26 +0000 (0:00:00.031) 0:00:11.305 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:23:26 +0000 (0:00:00.044) 0:00:11.349 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK 
[Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.517) 0:00:11.866 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.036) 0:00:11.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.030) 0:00:11.934 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk device mounted at "/opt/test1"] **************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:25 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.033) 0:00:11.967 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.084) 0:00:12.052 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.042) 0:00:12.095 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.500) 0:00:12.595 ******** skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.069) 0:00:12.665 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.037) 0:00:12.702 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.033) 0:00:12.735 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:23:27 +0000 (0:00:00.059) 0:00:12.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.025) 0:00:12.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.030) 0:00:12.851 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.032) 0:00:12.883 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.035) 0:00:12.919 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.029) 0:00:12.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.029) 0:00:12.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.029) 0:00:13.007 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.031) 0:00:13.039 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.042) 0:00:13.081 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:23:28 +0000 (0:00:00.026) 0:00:13.107 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:23:29 +0000 (0:00:01.247) 0:00:14.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.030) 0:00:14.385 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.028) 0:00:14.414 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.039) 0:00:14.453 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.038) 0:00:14.491 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.038) 0:00:14.530 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:23:29 +0000 (0:00:00.028) 0:00:14.558 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:23:30 +0000 (0:00:00.900) 0:00:15.459 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:23:31 +0000 (0:00:00.557) 0:00:16.016 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:23:31 +0000 (0:00:00.634) 0:00:16.650 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:23:32 +0000 (0:00:00.368) 0:00:17.019 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:23:32 +0000 (0:00:00.029) 0:00:17.048 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:35
Wednesday 01 June 2022 16:23:33 +0000 (0:00:00.845) 0:00:17.894 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:23:33 +0000 (0:00:00.052) 0:00:17.947 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:23:33 +0000 (0:00:00.067) 0:00:18.015 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:23:33 +0000 (0:00:00.037) 0:00:18.052 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:23:33 +0000 (0:00:00.467) 0:00:18.520 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002637", "end": "2022-06-01 12:23:33.633134", "rc": 0, "start": "2022-06-01 12:23:33.630497" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.457) 0:00:18.977 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002445", "end": "2022-06-01 12:23:34.003108", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:23:34.000663" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.367) 0:00:19.345 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.027) 0:00:19.373 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.032) 0:00:19.405 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.062) 0:00:19.468 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.034) 0:00:19.503 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.119) 0:00:19.623 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.033) 0:00:19.656 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.041) 0:00:19.698 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.035) 0:00:19.733 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:23:34 +0000 (0:00:00.035) 0:00:19.768 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.037) 0:00:19.806 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.028) 0:00:19.835 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.029) 0:00:19.865 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.029) 0:00:19.894 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.033) 0:00:19.928 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.048) 0:00:19.976 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.038) 0:00:20.015 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.035) 0:00:20.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.040) 0:00:20.091 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.032) 0:00:20.123 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.037) 0:00:20.160 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.035) 0:00:20.196 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100608.9511216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100608.9511216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100608.9511216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.361) 0:00:20.558 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.038) 0:00:20.596 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.039) 0:00:20.636 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.035) 0:00:20.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.031) 0:00:20.703 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.036) 0:00:20.739 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.029) 0:00:20.769 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:23:35 +0000 (0:00:00.032) 0:00:20.801 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.028) 0:00:20.830 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.035) 0:00:20.865 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.028) 0:00:20.894 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:20.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.029) 0:00:20.955 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.033) 0:00:20.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.075) 0:00:21.064 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.039) 0:00:21.103 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.034) 0:00:21.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.029) 0:00:21.167 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.027) 0:00:21.195 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.027) 0:00:21.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:21.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.030) 0:00:21.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:21.316 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.030) 0:00:21.347 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.029) 0:00:21.376 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.029) 0:00:21.406 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.040) 0:00:21.446 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.032) 0:00:21.478 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:21.510 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.030) 0:00:21.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.030) 0:00:21.571 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.032) 0:00:21.603 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.033) 0:00:21.636 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:21.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.028) 0:00:21.697 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.030) 0:00:21.727 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.031) 0:00:21.759 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:23:36 +0000 (0:00:00.033) 0:00:21.792 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.034) 0:00:21.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.030) 0:00:21.857 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.031) 0:00:21.889 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.030) 0:00:21.919 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.029) 0:00:21.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.029) 0:00:21.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.032) 0:00:22.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.030) 0:00:22.041 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts]
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.028) 0:00:22.070 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.029) 0:00:22.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the disk device mount location to "/opt/test2"] ******************* task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:37 Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.030) 0:00:22.129 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.076) 0:00:22.206 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:23:37 +0000 (0:00:00.047) 0:00:22.253 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.565) 0:00:22.819 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.072) 0:00:22.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.031) 0:00:22.923 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.029) 0:00:22.953 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 
01 June 2022 16:23:38 +0000 (0:00:00.061) 0:00:23.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.025) 0:00:23.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.029) 0:00:23.070 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.032) 0:00:23.103 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test2", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.035) 0:00:23.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.031) 0:00:23.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.029) 0:00:23.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.031) 0:00:23.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.031) 0:00:23.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.045) 0:00:23.307 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:23:38 +0000 (0:00:00.029) 0:00:23.336 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" 
], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:23:39 +0000 (0:00:01.106) 0:00:24.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:23:39 +0000 (0:00:00.031) 0:00:24.474 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:23:39 +0000 (0:00:00.028) 0:00:24.502 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", 
"/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:23:39 +0000 (0:00:00.038) 0:00:24.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:23:39 +0000 (0:00:00.033) 0:00:24.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", 
"_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:23:39 +0000 (0:00:00.038) 0:00:24.613 ******** changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:23:40 +0000 (0:00:00.404) 0:00:25.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 
16:23:40 +0000 (0:00:00.663) 0:00:25.680 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:23:41 +0000 (0:00:00.412) 0:00:26.093 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:23:41 +0000 (0:00:00.646) 0:00:26.740 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:23:42 +0000 (0:00:00.371) 0:00:27.112 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:23:42 +0000 (0:00:00.029) 0:00:27.141 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:47 
Wednesday 01 June 2022 16:23:43 +0000 (0:00:00.867) 0:00:28.009 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:23:43 +0000 (0:00:00.066) 0:00:28.076 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:23:43 +0000 (0:00:00.031) 0:00:28.108 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:23:43 +0000 (0:00:00.036) 0:00:28.144 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:23:43 +0000 (0:00:00.369) 0:00:28.513 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.003059", "end": "2022-06-01 12:23:43.551619", "rc": 0, "start": "2022-06-01 12:23:43.548560" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.387) 0:00:28.900 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002596", "end": "2022-06-01 12:23:43.922286", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:23:43.919690" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.366) 0:00:29.267 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.072) 0:00:29.339 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.031) 0:00:29.371 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.062) 0:00:29.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.034) 0:00:29.467 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.112) 0:00:29.579 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.035) 0:00:29.615 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.043) 0:00:29.658 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.037) 0:00:29.696 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.033) 0:00:29.730 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
*****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.036) 0:00:29.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:23:44 +0000 (0:00:00.029) 0:00:29.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.029) 0:00:29.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.032) 0:00:29.858 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.030) 0:00:29.888 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.048) 0:00:29.937 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.033) 0:00:29.970 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.036) 0:00:30.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.030) 0:00:30.036 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.031) 0:00:30.068 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.034) 0:00:30.103 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.036) 0:00:30.139 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100608.9511216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100608.9511216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100608.9511216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.394) 0:00:30.534 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.037) 0:00:30.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.035) 0:00:30.607 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.033) 0:00:30.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.031) 0:00:30.672 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.034) 0:00:30.707 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.028) 0:00:30.736 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.028) 0:00:30.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:23:45 +0000 (0:00:00.028) 0:00:30.794 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.035) 0:00:30.830 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:30.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.029) 0:00:30.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.029) 0:00:30.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.029) 0:00:30.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.029) 0:00:30.979 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.039) 0:00:31.018 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.039) 0:00:31.058 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.089 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.119 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.029) 0:00:31.149 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.179 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.209 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.033) 0:00:31.242 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.274 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.338 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.034) 0:00:31.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.497 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.035) 0:00:31.533 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.564 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.071) 0:00:31.667 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.031) 0:00:31.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.030) 0:00:31.729 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.033) 0:00:31.763 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:23:46 +0000 (0:00:00.032) 0:00:31.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.029) 0:00:31.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.032) 0:00:31.858 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.029) 0:00:31.887 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.030) 0:00:31.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.029) 0:00:31.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.029) 0:00:31.977 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.031) 0:00:32.009 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.031) 0:00:32.040 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.030) 0:00:32.070 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:49
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.031) 0:00:32.102 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.064) 0:00:32.167 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.045) 0:00:32.213 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.512) 0:00:32.725 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:23:47 +0000 (0:00:00.070) 0:00:32.796 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.031) 0:00:32.827 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.035) 0:00:32.863 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.069) 0:00:32.932 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.027) 0:00:32.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.033) 0:00:32.993 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.033) 0:00:33.027 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test2", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.036) 0:00:33.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.030) 0:00:33.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.029) 0:00:33.124 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.030) 0:00:33.154 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.031) 0:00:33.185 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.046) 0:00:33.232 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:23:48 +0000 (0:00:00.027) 0:00:33.260 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:23:49 +0000 (0:00:01.072) 0:00:34.333 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.032) 0:00:34.365 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.028) 0:00:34.393 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.036) 0:00:34.430 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.035) 0:00:34.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.037) 0:00:34.503 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:23:49 +0000 (0:00:00.029) 0:00:34.533 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:23:50 +0000 (0:00:00.660) 0:00:35.193 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:23:50 +0000 (0:00:00.380) 0:00:35.574 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:23:51 +0000 (0:00:00.661) 0:00:36.236 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:23:51 +0000 (0:00:00.370) 0:00:36.606 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:23:51 +0000 (0:00:00.029) 0:00:36.636 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:59
Wednesday 01 June 2022 16:23:52 +0000 (0:00:00.850) 0:00:37.487 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:23:52 +0000 (0:00:00.059) 0:00:37.546 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:23:52 +0000 (0:00:00.029) 0:00:37.575 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:23:52 +0000 (0:00:00.036) 0:00:37.611 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:23:53 +0000 (0:00:00.397) 0:00:38.009 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002674", "end": "2022-06-01 12:23:53.023383", "rc": 0, "start": "2022-06-01 12:23:53.020709" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:23:53 +0000 (0:00:00.358) 0:00:38.368 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002460", "end": "2022-06-01 12:23:53.392453", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:23:53.389993" }
STDERR: cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:23:53 +0000 (0:00:00.367) 0:00:38.736 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:23:53 +0000 (0:00:00.029) 0:00:38.765 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:23:53 +0000 (0:00:00.032) 0:00:38.797 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.061) 0:00:38.858 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.035) 0:00:38.893 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.110) 0:00:39.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.037) 0:00:39.041 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "cce6a33b-68fb-40e0-a90c-8e584b8650b0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.041) 0:00:39.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.070) 0:00:39.154 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.035) 0:00:39.189 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.037) 0:00:39.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.030) 0:00:39.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.029) 0:00:39.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.028) 0:00:39.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.032) 0:00:39.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.046) 0:00:39.394 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.032) 0:00:39.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.038) 0:00:39.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.030) 0:00:39.495 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.030) 0:00:39.526 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.038) 0:00:39.564 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:23:54 +0000 (0:00:00.039) 0:00:39.603 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100608.9511216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100608.9511216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100608.9511216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.397) 0:00:40.001 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.038) 0:00:40.040 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.037) 0:00:40.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.033) 0:00:40.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.030) 0:00:40.142 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.037) 0:00:40.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.028) 0:00:40.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 
June 2022 16:23:55 +0000 (0:00:00.027) 0:00:40.267 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.033) 0:00:40.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.027) 0:00:40.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.033) 0:00:40.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.030) 0:00:40.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.030) 0:00:40.455 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.038) 0:00:40.493 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.035) 0:00:40.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.033) 0:00:40.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.029) 0:00:40.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.029) 0:00:40.620 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.034) 0:00:40.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:23:55 +0000 (0:00:00.031) 0:00:40.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:40.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.029) 0:00:40.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:40.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.028) 0:00:40.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.031) 0:00:40.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.028) 0:00:40.960 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:40.990 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.028) 
0:00:41.018 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:41.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.029) 0:00:41.078 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.032) 0:00:41.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.031) 0:00:41.143 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.034) 0:00:41.178 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.032) 0:00:41.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
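The size-verification tasks are all skipped in this run, but elsewhere in the log the byte count reported for the volume (size: 10737418240) has to agree with the human-readable figure blkid/lsblk report ("10G"). A minimal sketch of that conversion, assuming binary (IEC) units as the "10G"/"500M"/"368K" figures above use; this is an illustration only, not the role's actual parser (the role is blivet-based and presumably uses blivet's own size handling):

```python
# Convert an lsblk-style human-readable size ("10G", "500M", "368K")
# to bytes, assuming binary (IEC) multiples: K = 2**10, M = 2**20, ...

_UNITS = {"B": 1, "K": 2**10, "M": 2**20, "G": 2**30, "T": 2**40}

def to_bytes(size):
    size = size.strip()
    if size[-1].isdigit():          # bare number: already bytes
        return int(float(size))
    return int(float(size[:-1]) * _UNITS[size[-1].upper()])

# "10G" matches the volume size reported for /dev/sda in this log.
assert to_bytes("10G") == 10737418240
```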
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.029) 0:00:41.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:41.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.034) 0:00:41.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.031) 0:00:41.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.031) 0:00:41.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.031) 0:00:41.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:41.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.030) 0:00:41.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.037) 0:00:41.498 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:61 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.069) 0:00:41.568 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.078) 0:00:41.646 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:23:56 +0000 (0:00:00.045) 0:00:41.692 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.524) 0:00:42.216 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.072) 0:00:42.289 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.032) 0:00:42.321 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.032) 0:00:42.353 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.062) 0:00:42.415 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.026) 0:00:42.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.031) 0:00:42.474 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.033) 0:00:42.507 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test2", "name": "test1", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.036) 0:00:42.543 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.029) 0:00:42.573 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.028) 0:00:42.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.028) 0:00:42.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.032) 0:00:42.663 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.050) 0:00:42.713 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:23:57 +0000 (0:00:00.028) 0:00:42.741 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:23:59 +0000 (0:00:01.337) 0:00:44.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.031) 0:00:44.111 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.027) 0:00:44.138 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.037) 0:00:44.175 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.034) 0:00:44.210 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.035) 0:00:44.245 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:23:59 +0000 (0:00:00.392) 0:00:44.638 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:24:00 +0000 (0:00:00.649) 0:00:45.288 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:24:00 +0000 (0:00:00.031) 0:00:45.319 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:24:01 +0000 (0:00:00.627) 0:00:45.946 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:24:01 +0000 (0:00:00.363) 0:00:46.309 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:24:01 +0000 (0:00:00.029) 0:00:46.339 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:72
Wednesday 01 June 2022 16:24:02 +0000 (0:00:00.856) 0:00:47.195 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:24:02 +0000 (0:00:00.063) 0:00:47.258 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:24:02 +0000 (0:00:00.029) 0:00:47.288 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=cce6a33b-68fb-40e0-a90c-8e584b8650b0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:24:02 +0000 (0:00:00.039) 0:00:47.328 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:24:02 +0000 (0:00:00.385) 0:00:47.714 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002681", "end": "2022-06-01 12:24:02.747349", "rc": 0, "start": "2022-06-01 12:24:02.744668" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.379) 0:00:48.093 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002664", "end": "2022-06-01 12:24:03.112802", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:24:03.110138" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.365) 0:00:48.459 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.030) 0:00:48.490 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.030) 0:00:48.520 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
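A detail worth noting in the crypttab read above: `cat /etc/crypttab` exits with rc 1 because the file does not exist, yet the play continues, since the result carries `"failed_when_result": false`. A minimal sketch of that error-tolerance pattern, assuming a task shape like the test's (the register variable name here is illustrative, not taken from the test file):

```yaml
# Sketch only: illustrates the failed_when pattern implied by
# "failed_when_result": false in the log above.
- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: crypttab_contents   # illustrative variable name
  failed_when: false            # tolerate rc 1 when the file is absent
  changed_when: false           # reading a file never changes state
```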
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.057) 0:00:48.578 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.034) 0:00:48.612 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:24:03 +0000 (0:00:00.114) 0:00:48.727 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.078) 0:00:48.806 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.040) 0:00:48.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.030) 0:00:48.877 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.034) 0:00:48.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.029) 0:00:48.941 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.030) 0:00:48.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.031) 0:00:49.003 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.030) 0:00:49.033 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.030) 0:00:49.064 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.045) 0:00:49.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.026) 0:00:49.136 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.036) 0:00:49.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.032) 0:00:49.205 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.031) 0:00:49.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.029) 0:00:49.266 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.025) 0:00:49.292 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100638.6691215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100638.6691215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654100638.6691215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.386) 0:00:49.678 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.037) 0:00:49.716 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.026) 0:00:49.742 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:24:04 +0000 (0:00:00.033) 0:00:49.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:49.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.025) 0:00:49.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.027) 0:00:49.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.027) 0:00:49.887 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.029) 0:00:49.917 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.027) 0:00:49.944 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:49.975 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.005 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.037 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.067 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.097 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.039) 0:00:50.137 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.034) 0:00:50.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.029) 0:00:50.201 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.028) 0:00:50.230 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.028) 0:00:50.259 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.029) 0:00:50.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.040) 0:00:50.329 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.392 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.032) 0:00:50.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.457 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.487 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.034) 0:00:50.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.033) 0:00:50.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.028) 0:00:50.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.614 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.648 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.679 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.033) 0:00:50.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.031) 0:00:50.744 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:24:05 +0000 (0:00:00.030) 0:00:50.775 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.030) 0:00:50.805 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.032) 0:00:50.837 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.030) 0:00:50.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.032) 0:00:50.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.047) 0:00:50.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday
01 June 2022 16:24:06 +0000 (0:00:00.030) 0:00:50.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.030) 0:00:51.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.029) 0:00:51.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.027) 0:00:51.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.029) 0:00:51.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.031) 0:00:51.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.030) 0:00:51.157 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=262 changed=6 unreachable=0 failed=0 skipped=228 rescued=0 ignored=0 Wednesday 01 June 2022 16:24:06 +0000 (0:00:00.015) 0:00:51.172 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.25s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.10s /tmp/tmp7247_7fr/tests/tests_change_disk_mount_scsi_generated.yml:3 ----------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.83s /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml:2 -------------------------- linux-system-roles.storage : make sure required packages are installed --- 0.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : get required packages ---------------------- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = 
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
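The run of "Skipping callback" messages above is expected behavior: Ansible loads every callback plugin it finds, but only one plugin may own stdout, so each additional stdout-capable callback is skipped for that duty. A minimal sketch of selecting a different stdout callback through ansible.cfg (a hypothetical local config, not part of this test run):

```ini
; ansible.cfg (hypothetical): only the plugin named here acts as the
; stdout callback; every other stdout-capable plugin is "skipped",
; exactly as in the log above
[defaults]
stdout_callback = yaml
```

With this setting the task results would be rendered by the `yaml` callback instead of the format shown in this log.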
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:24:07 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:24:08 +0000 (0:00:01.276) 0:00:01.299 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_fs.yml **************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_change_fs.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:2
Wednesday 01 June 2022 16:24:08 +0000 (0:00:00.016) 0:00:01.316 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:13
Wednesday 01 June 2022 16:24:09 +0000 (0:00:01.083) 0:00:02.399 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:24:09 +0000 (0:00:00.040) 0:00:02.439 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:24:09 +0000 (0:00:00.152) 0:00:02.592 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.529) 0:00:03.121 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.073) 0:00:03.195 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.022) 0:00:03.217 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.021) 0:00:03.238 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.190) 0:00:03.429 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:24:10 +0000 (0:00:00.019) 0:00:03.449 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:24:11 +0000 (0:00:01.079) 0:00:04.528 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:24:11 +0000 (0:00:00.046) 0:00:04.575 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:24:11 +0000 (0:00:00.045) 0:00:04.620 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:24:12 +0000 (0:00:00.665) 0:00:05.286 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:24:12 +0000 (0:00:00.078) 0:00:05.364 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:24:12 +0000 (0:00:00.021) 0:00:05.386 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:24:12 +0000 (0:00:00.023) 0:00:05.410 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:24:12 +0000 (0:00:00.024) 0:00:05.435 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:24:13 +0000 (0:00:00.823) 0:00:06.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:24:15 +0000 (0:00:01.766) 0:00:08.025 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:24:15 +0000 
(0:00:00.043) 0:00:08.068 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.028) 0:00:08.097 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.501) 0:00:08.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.056) 0:00:08.654 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.028) 0:00:08.683 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.034) 0:00:08.718 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.033) 0:00:08.751 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.031) 0:00:08.783 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.026) 0:00:08.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.030) 0:00:08.841 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:24:15 +0000 (0:00:00.027) 0:00:08.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:24:16 +0000 (0:00:00.028) 0:00:08.897 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:24:16 +0000 (0:00:00.449) 0:00:09.347 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:24:16 +0000 (0:00:00.028) 0:00:09.375 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:16 Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.808) 0:00:10.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:23 Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.029) 0:00:10.212 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.044) 0:00:10.257 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.498) 0:00:10.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.037) 0:00:10.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.033) 0:00:10.826 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create a LVM logical volume with default fs_type] ************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:28
Wednesday 01 June 2022 16:24:17 +0000 (0:00:00.034) 0:00:10.861 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.055) 0:00:10.916 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.042) 0:00:10.959 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.498) 0:00:11.458 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.067) 0:00:11.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.029) 0:00:11.555 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.030) 0:00:11.586 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.060) 0:00:11.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.025) 0:00:11.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.029) 0:00:11.701 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.036) 0:00:11.737 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.032) 0:00:11.770 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.061) 0:00:11.831 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:24:18 +0000 (0:00:00.030) 0:00:11.862 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:24:19 +0000 (0:00:00.028) 0:00:11.890 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:24:19 +0000 (0:00:00.027) 0:00:11.918 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:24:19 +0000 (0:00:00.038) 0:00:11.956 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:24:19 +0000 (0:00:00.024) 0:00:11.981 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo",
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:24:20 +0000 (0:00:01.740) 0:00:13.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:24:20 +0000 (0:00:00.031) 0:00:13.753 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:24:20 +0000 (0:00:00.027) 0:00:13.781 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:24:20 +0000 (0:00:00.038) 0:00:13.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 
Wednesday 01 June 2022 16:24:20 +0000 (0:00:00.036) 0:00:13.855 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:24:21 +0000 (0:00:00.032) 0:00:13.887 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:24:21 +0000 (0:00:00.027) 0:00:13.915 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:24:21 +0000 (0:00:00.929) 0:00:14.845 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:24:22 +0000 (0:00:00.553) 0:00:15.398 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:24:23 +0000 (0:00:00.664) 0:00:16.063 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:24:23 +0000 (0:00:00.357) 0:00:16.420 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:24:23 +0000 (0:00:00.029) 0:00:16.450 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:40
Wednesday 01 June 2022 16:24:24 +0000 (0:00:00.833) 0:00:17.283 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:24:24 +0000 (0:00:00.053) 0:00:17.337 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:24:24 +0000 (0:00:00.040) 0:00:17.378 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:24:24 +0000 (0:00:00.028) 0:00:17.407 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ydjcN2-chtV-20L3-dshY-r0lz-qRYX-0hXNBI" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype":
"", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:24:25 +0000 (0:00:00.494) 0:00:17.901 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002675", "end": "2022-06-01 12:24:24.986817", "rc": 0, "start": "2022-06-01 12:24:24.984142" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:24:25 +0000 (0:00:00.508) 0:00:18.409 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002727", "end": "2022-06-01 12:24:25.350342", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:24:25.347615" } STDERR: cat: /etc/crypttab: No such 
file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:24:25 +0000 (0:00:00.364) 0:00:18.774 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:24:25 +0000 (0:00:00.065) 0:00:18.840 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:24:25 +0000 (0:00:00.031) 0:00:18.871 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.062) 0:00:18.934 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.037) 0:00:18.971 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:24:26 +0000 
(0:00:00.468) 0:00:19.440 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.042) 0:00:19.483 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.038) 0:00:19.521 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.035) 0:00:19.557 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.037) 0:00:19.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.030) 0:00:19.624 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.043) 0:00:19.668 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.058) 0:00:19.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.031) 0:00:19.758 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.029) 0:00:19.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.029) 0:00:19.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.030) 0:00:19.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:24:26 +0000 (0:00:00.029) 0:00:19.876 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.029) 0:00:19.906 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.030) 0:00:19.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.037) 0:00:19.974 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.067) 0:00:20.041 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.062) 0:00:20.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.031) 0:00:20.135 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.031) 0:00:20.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.030) 0:00:20.197 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.059) 0:00:20.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.036) 0:00:20.294 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.035) 0:00:20.329 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.056) 0:00:20.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.035) 0:00:20.422 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.035) 0:00:20.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.028) 0:00:20.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.028) 0:00:20.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.032) 0:00:20.547 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.031) 0:00:20.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.028) 0:00:20.606 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.063) 0:00:20.670 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.065) 0:00:20.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.036) 0:00:20.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.030) 0:00:20.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 
Wednesday 01 June 2022 16:24:27 +0000 (0:00:00.029) 0:00:20.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.067) 0:00:20.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.029) 0:00:20.929 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.031) 0:00:20.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.030) 0:00:20.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.030) 0:00:21.022 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.030) 0:00:21.052 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.066) 0:00:21.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.034) 0:00:21.153 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.122) 0:00:21.276 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.037) 0:00:21.313 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.041) 0:00:21.355 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.039) 0:00:21.394 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:24:28 +0000 
(0:00:00.038) 0:00:21.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.038) 0:00:21.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.032) 0:00:21.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.031) 0:00:21.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.030) 0:00:21.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.030) 0:00:21.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.048) 0:00:21.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.035) 0:00:21.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.037) 0:00:21.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.032) 0:00:21.750 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.032) 0:00:21.783 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.036) 0:00:21.820 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:24:28 +0000 (0:00:00.039) 0:00:21.859 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100660.2191215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100660.2191215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1191, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100660.2191215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.377) 0:00:22.237 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.037) 0:00:22.275 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.036) 0:00:22.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.033) 0:00:22.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.029) 0:00:22.374 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.034) 0:00:22.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.032) 0:00:22.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.037) 0:00:22.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.029) 0:00:22.660 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.033) 0:00:22.693 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.038) 0:00:22.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.035) 0:00:22.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:24:29 +0000 (0:00:00.030) 0:00:22.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.033) 0:00:22.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.032) 0:00:22.925 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.030) 0:00:22.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.031) 0:00:22.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.035) 0:00:23.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.030) 0:00:23.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.035) 0:00:23.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.070) 0:00:23.160 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:24:30 +0000 (0:00:00.503) 0:00:23.664 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.372) 0:00:24.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.038) 0:00:24.074 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.032) 0:00:24.107 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.032) 0:00:24.139 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.030) 0:00:24.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.032) 0:00:24.202 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.030) 0:00:24.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.030) 0:00:24.262 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.033) 0:00:24.296 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.032) 0:00:24.328 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.038) 0:00:24.367 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043716", "end": "2022-06-01 12:24:31.365496", "rc": 0, "start": "2022-06-01 12:24:31.321780" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.429) 0:00:24.796 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.040) 0:00:24.836 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:24:31 +0000 (0:00:00.039) 0:00:24.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.032) 0:00:24.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.031) 0:00:24.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.031) 0:00:24.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.035) 0:00:25.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.035) 0:00:25.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.031) 0:00:25.074 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.029) 0:00:25.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the file system signature on the logical volume created above] **** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:42 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.029) 0:00:25.133 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.063) 0:00:25.196 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.043) 0:00:25.240 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.550) 0:00:25.790 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of 
pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:24:32 +0000 (0:00:00.072) 0:00:25.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.032) 0:00:25.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.032) 0:00:25.928 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.068) 0:00:25.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.028) 0:00:26.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.032) 0:00:26.058 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.040) 0:00:26.098 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.033) 0:00:26.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.032) 0:00:26.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.070) 0:00:26.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.030) 0:00:26.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.029) 0:00:26.294 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.043) 0:00:26.338 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:24:33 +0000 (0:00:00.028) 0:00:26.366 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:24:34 +0000 (0:00:01.353) 0:00:27.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:24:34 +0000 (0:00:00.033) 0:00:27.753 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:24:34 +0000 (0:00:00.029) 0:00:27.782 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:24:34 +0000 (0:00:00.040) 0:00:27.822 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:24:34 +0000 (0:00:00.038) 0:00:27.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:24:35 +0000 (0:00:00.034) 0:00:27.895 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:24:35 +0000 (0:00:00.029) 0:00:27.925 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:24:35 +0000 (0:00:00.661) 0:00:28.586 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:24:36 +0000 (0:00:00.393) 0:00:28.980 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:24:36 +0000 (0:00:00.655) 0:00:29.635 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:24:37 +0000 (0:00:00.360) 0:00:29.995 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:24:37 +0000 (0:00:00.030) 0:00:30.026 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:55 Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.894) 0:00:30.920 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.056) 0:00:30.977 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.038) 
0:00:31.015 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.027) 0:00:31.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ydjcN2-chtV-20L3-dshY-r0lz-qRYX-0hXNBI" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK 
[Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.405) 0:00:31.448 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003106", "end": "2022-06-01 12:24:38.401439", "rc": 0, "start": "2022-06-01 12:24:38.398333" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:24:38 +0000 (0:00:00.382) 0:00:31.831 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002558", "end": "2022-06-01 12:24:38.777889", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:24:38.775331" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG:
non-zero return code
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.379) 0:00:32.210 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.066) 0:00:32.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.032) 0:00:32.309 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.063) 0:00:32.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.039) 0:00:32.412 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.382) 0:00:32.795 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.042) 0:00:32.837 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:24:39 +0000 (0:00:00.038) 0:00:32.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.033) 0:00:32.909 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.037) 0:00:32.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.031) 0:00:32.979 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.042) 0:00:33.021 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.055) 0:00:33.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.032) 0:00:33.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.031) 0:00:33.141 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.030) 0:00:33.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.029) 0:00:33.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.029) 0:00:33.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:24:40 +0000 (0:00:00.030) 0:00:33.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.031) 0:00:33.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.030) 0:00:33.324 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.057) 0:00:33.381 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.059) 0:00:33.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.032) 0:00:33.473 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:24:40 +0000 (0:00:00.030) 0:00:33.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.029) 0:00:33.533 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.062) 0:00:33.595 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.034) 0:00:33.630 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.035) 0:00:33.665 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.057) 0:00:33.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.036) 0:00:33.759 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.033) 0:00:33.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.028) 0:00:33.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:24:40 +0000 (0:00:00.042) 0:00:33.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.037) 0:00:33.901 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.033) 0:00:33.935 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.036) 0:00:33.972 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.064) 0:00:34.037 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.064) 0:00:34.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.080) 0:00:34.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.032) 0:00:34.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.031) 0:00:34.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.030) 0:00:34.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.031) 0:00:34.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.031) 0:00:34.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.035) 0:00:34.372 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.032) 0:00:34.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.032) 0:00:34.438 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.060) 0:00:34.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.038) 0:00:34.538 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.122) 0:00:34.660 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.037) 0:00:34.698 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.043) 0:00:34.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.037) 0:00:34.779 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:24:41 +0000 
(0:00:00.036) 0:00:34.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:24:41 +0000 (0:00:00.039) 0:00:34.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.031) 0:00:34.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.030) 0:00:34.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.028) 0:00:34.945 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.029) 0:00:34.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.043) 0:00:35.018 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.033) 0:00:35.051 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.033) 0:00:35.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.028) 0:00:35.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.029) 0:00:35.142 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.034) 0:00:35.177 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.038) 0:00:35.215 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100660.2191215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100660.2191215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1191, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100660.2191215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.394) 0:00:35.610 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.039) 0:00:35.650 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.036) 0:00:35.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.035) 0:00:35.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.037) 0:00:35.759 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.037) 0:00:35.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.031) 0:00:35.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:24:42 +0000 (0:00:00.033) 0:00:35.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.035) 0:00:35.897 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.038) 0:00:35.936 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.031) 0:00:35.967 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.031) 0:00:35.998 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.031) 0:00:36.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.033) 0:00:36.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.030) 0:00:36.094 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.036) 0:00:36.130 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.033) 0:00:36.164 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.029) 0:00:36.194 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.029) 0:00:36.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.030) 0:00:36.254 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.030) 0:00:36.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.029) 0:00:36.314 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.029) 0:00:36.344 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.030) 0:00:36.374 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.030) 0:00:36.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.070) 0:00:36.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.031) 0:00:36.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:24:43 +0000 (0:00:00.032) 0:00:36.539 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.372) 0:00:36.912 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.378) 0:00:37.291 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.038) 0:00:37.330 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.033) 0:00:37.363 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.030) 0:00:37.394 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.030) 0:00:37.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.032) 0:00:37.458 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.030) 0:00:37.488 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.030) 0:00:37.519 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.037) 0:00:37.556 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.034) 0:00:37.590 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:24:44 +0000 (0:00:00.039) 0:00:37.630 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036395", "end": "2022-06-01 12:24:44.614750", "rc": 0, "start": "2022-06-01 12:24:44.578355" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.412) 0:00:38.043 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.040) 0:00:38.083 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.041) 0:00:38.125 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.047) 0:00:38.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact]
****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.041) 0:00:38.213 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.033) 0:00:38.247 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.034) 0:00:38.281 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.036) 0:00:38.317 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.031) 0:00:38.349 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.028) 0:00:38.377 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Re-run the role on the same volume without specifying fs_type] ***********
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:57
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.030) 0:00:38.408 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.069) 0:00:38.477 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:24:45 +0000 (0:00:00.044) 0:00:38.522 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.524) 0:00:39.047 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.073) 0:00:39.121 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.031) 0:00:39.152 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.032) 0:00:39.185 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.060) 0:00:39.245 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.026) 0:00:39.272 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.029) 0:00:39.301 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ {
"disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.036) 0:00:39.337 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.033) 0:00:39.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.029) 0:00:39.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.031) 0:00:39.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.098) 0:00:39.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.032) 0:00:39.562 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.043) 0:00:39.606 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:24:46 +0000 (0:00:00.026) 0:00:39.632 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:24:48 +0000 (0:00:01.344) 0:00:40.977 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.031) 0:00:41.009 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.028) 0:00:41.037 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key":
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.043) 0:00:41.080 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.037) 0:00:41.118 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.034) 0:00:41.152 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.029) 0:00:41.182 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:24:48 +0000 (0:00:00.631) 0:00:41.813 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:24:49 +0000 (0:00:00.388) 0:00:42.202 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:24:49 +0000 (0:00:00.671) 0:00:42.873 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:24:50 +0000 (0:00:00.374) 0:00:43.247 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:24:50 +0000 (0:00:00.029) 0:00:43.277 ********
ok: [/cache/rhel-x.qcow2]

TASK [Verify the output of the duplicate volumes test] *************************
task path:
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:69
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.839) 0:00:44.116 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:74
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.037) 0:00:44.154 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.061) 0:00:44.215 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.031) 0:00:44.258 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.031) 0:00:44.290 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ydjcN2-chtV-20L3-dshY-r0lz-qRYX-0hXNBI" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:24:51 +0000 (0:00:00.378) 0:00:44.669 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002492", "end": "2022-06-01 12:24:51.598632", "rc": 0, "start": "2022-06-01 12:24:51.596140" }
STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.356) 0:00:45.026 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002329", "end": "2022-06-01 12:24:51.967435", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:24:51.965106" }
STDERR: cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.368) 0:00:45.394 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.070) 0:00:45.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.032) 0:00:45.498 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.065) 0:00:45.563 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:24:52 +0000 (0:00:00.093) 0:00:45.657 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.386) 0:00:46.043 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.043) 0:00:46.086 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.038) 0:00:46.125 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.035) 0:00:46.160 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.035) 0:00:46.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.031) 0:00:46.227 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.041) 0:00:46.269 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.055) 0:00:46.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.030) 0:00:46.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.031) 0:00:46.386 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.029) 0:00:46.416 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.030) 0:00:46.446 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.029) 0:00:46.476 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June
2022 16:24:53 +0000 (0:00:00.030) 0:00:46.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.030) 0:00:46.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.032) 0:00:46.569 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.057) 0:00:46.626 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.062) 0:00:46.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.033) 0:00:46.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:24:53 +0000 (0:00:00.030) 0:00:46.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.029) 0:00:46.783 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:24:53 +0000 (0:00:00.060) 0:00:46.843 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.049) 0:00:46.893 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.040) 0:00:46.934 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.059) 0:00:46.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.036) 0:00:47.030 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.034) 0:00:47.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.033) 0:00:47.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.029) 0:00:47.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.030) 0:00:47.158 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.032) 0:00:47.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.033) 0:00:47.225 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.063) 0:00:47.288 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.061) 0:00:47.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.030) 0:00:47.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.032) 0:00:47.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.030) 0:00:47.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.029) 0:00:47.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.029) 0:00:47.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.030) 0:00:47.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.034) 0:00:47.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.033) 0:00:47.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.031) 0:00:47.632 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.061) 0:00:47.693 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:24:54 +0000 (0:00:00.036) 0:00:47.730 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.182) 0:00:47.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.036) 0:00:47.949 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.042) 0:00:47.992 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.037) 0:00:48.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:24:55 +0000 
(0:00:00.035) 0:00:48.065 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.040) 0:00:48.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.047) 0:00:48.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.037) 0:00:48.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.031) 0:00:48.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.032) 0:00:48.254 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.045) 0:00:48.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.035) 0:00:48.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.038) 0:00:48.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.029) 0:00:48.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.030) 0:00:48.434 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.037) 0:00:48.472 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.037) 0:00:48.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100660.2191215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100660.2191215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1191, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100660.2191215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:24:55 +0000 (0:00:00.371) 0:00:48.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.038) 0:00:48.920 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.036) 0:00:48.957 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.034) 0:00:48.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.029) 0:00:49.021 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.034) 0:00:49.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.029) 0:00:49.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.029) 0:00:49.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.032) 0:00:49.147 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.043) 0:00:49.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.031) 0:00:49.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.031) 0:00:49.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.030) 0:00:49.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.030) 0:00:49.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.031) 0:00:49.346 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.041) 0:00:49.387 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.035) 0:00:49.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.032) 0:00:49.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.030) 0:00:49.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.047) 0:00:49.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.034) 0:00:49.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.035) 0:00:49.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.034) 0:00:49.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.031) 0:00:49.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.035) 0:00:49.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.035) 0:00:49.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.031) 0:00:49.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:24:56 +0000 (0:00:00.032) 0:00:49.805 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.358) 0:00:50.164 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.358) 0:00:50.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.087) 0:00:50.610 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.033) 0:00:50.644 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.031) 0:00:50.675 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.033) 0:00:50.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.032) 0:00:50.741 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.039) 0:00:50.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.036) 0:00:50.817 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:24:57 +0000 (0:00:00.036) 0:00:50.853 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.033) 0:00:50.887 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.038) 0:00:50.926 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.031484", "end": "2022-06-01 12:24:57.895485", "rc": 0, "start": "2022-06-01 12:24:57.864001" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.395) 0:00:51.321 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.039) 0:00:51.361 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.039) 0:00:51.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.033) 0:00:51.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.034) 0:00:51.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.032) 0:00:51.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.033) 0:00:51.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.034) 0:00:51.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.031) 0:00:51.600 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.031) 0:00:51.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:76 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.031) 0:00:51.663 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.071) 0:00:51.735 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:24:58 +0000 (0:00:00.045) 0:00:51.780 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.521) 0:00:52.301 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of 
pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.072) 0:00:52.374 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.029) 0:00:52.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.032) 0:00:52.436 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.060) 0:00:52.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.028) 0:00:52.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.032) 0:00:52.558 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.040) 0:00:52.598 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.032) 0:00:52.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.030) 0:00:52.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.030) 0:00:52.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.030) 0:00:52.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.036) 0:00:52.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.046) 0:00:52.806 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:24:59 +0000 (0:00:00.028) 0:00:52.835 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:25:01 +0000 (0:00:01.336) 0:00:54.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.029) 0:00:54.201 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.030) 0:00:54.231 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.039) 0:00:54.271 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.038) 0:00:54.310 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.033) 0:00:54.343 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:25:01 +0000 (0:00:00.029) 0:00:54.373 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:25:02 +0000 (0:00:00.649) 0:00:55.022 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:25:02 +0000 (0:00:00.370) 0:00:55.393 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:25:03 +0000 (0:00:00.684) 0:00:56.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:25:03 +0000 (0:00:00.362) 0:00:56.440 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:25:03 +0000 (0:00:00.030) 0:00:56.470 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:89 Wednesday 01 June 2022 16:25:04 +0000 (0:00:00.841) 0:00:57.312 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:25:04 +0000 (0:00:00.062) 0:00:57.375 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:25:04 +0000 (0:00:00.037) 
0:00:57.412 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:25:04 +0000 (0:00:00.027) 0:00:57.439 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ydjcN2-chtV-20L3-dshY-r0lz-qRYX-0hXNBI" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK 
[Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:25:04 +0000 (0:00:00.358) 0:00:57.798 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002696", "end": "2022-06-01 12:25:04.736624", "rc": 0, "start": "2022-06-01 12:25:04.733928" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.377) 0:00:58.176 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002654", "end": "2022-06-01 12:25:05.121107", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:25:05.118453" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.373) 0:00:58.549 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
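The fstab check above boils down to finding the line whose mount-point field matches `/opt/test1`. A minimal sketch of that lookup in Python — `fstab_entry_for` is a hypothetical helper for illustration, not part of the role or the test suite:

```python
def fstab_entry_for(fstab_text, mount_point):
    """Return (device, fstype, options) for the fstab line whose second
    field matches mount_point, or None if no such entry exists."""
    for line in fstab_text.splitlines():
        fields = line.split()
        # Skip blanks and comments; a valid fstab entry has >= 4 fields.
        if len(fields) >= 4 and not fields[0].startswith("#"):
            if fields[1] == mount_point:
                return fields[0], fields[2], fields[3]
    return None

# The fstab content reported by the task above:
fstab = """\
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
"""

print(fstab_entry_for(fstab, "/opt/test1"))
# ('/dev/mapper/foo-test1', 'xfs', 'defaults')
```

The crypttab check is the mirror image: `cat /etc/crypttab` returning rc 1 with "No such file or directory" is the expected result here, since the pool is unencrypted and the test only fails the task on unexpected output, not on the non-zero return code (`failed_when_result: false`).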
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.064) 0:00:58.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.064) 0:00:58.678 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.063) 0:00:58.742 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:25:05 +0000 (0:00:00.039) 0:00:58.782 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.363) 0:00:59.146 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.039) 0:00:59.186 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.038) 0:00:59.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.036) 0:00:59.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.035) 0:00:59.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.029) 0:00:59.327 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.050) 0:00:59.377 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.069) 0:00:59.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.033) 0:00:59.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.033) 0:00:59.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.031) 0:00:59.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.033) 0:00:59.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.032) 0:00:59.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:25:06 +0000 (0:00:00.034) 0:00:59.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.032) 0:00:59.678 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.032) 0:00:59.710 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.059) 0:00:59.769 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.063) 0:00:59.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:25:06 +0000 (0:00:00.033) 0:00:59.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:25:07 +0000 (0:00:00.031) 0:00:59.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.031) 0:00:59.930 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.063) 0:00:59.993 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.035) 0:01:00.029 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.034) 0:01:00.064 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.053) 0:01:00.117 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.038) 0:01:00.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.037) 0:01:00.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.029) 0:01:00.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.028) 0:01:00.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.030) 0:01:00.282 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.032) 0:01:00.314 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.034) 0:01:00.349 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.065) 0:01:00.414 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.061) 0:01:00.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.035) 0:01:00.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.032) 0:01:00.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.031) 0:01:00.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.030) 0:01:00.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.030) 0:01:00.636 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.032) 0:01:00.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.031) 0:01:00.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.032) 0:01:00.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.035) 0:01:00.768 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:25:07 +0000 (0:00:00.072) 0:01:00.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.080) 0:01:00.922 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.115) 0:01:01.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.035) 0:01:01.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "a98ebfa1-42c9-40ec-8ec7-14c84a569f5f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.042) 0:01:01.116 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.037) 0:01:01.153 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:25:08 +0000 
(0:00:00.034) 0:01:01.187 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.037) 0:01:01.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.029) 0:01:01.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.028) 0:01:01.283 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.029) 0:01:01.312 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.030) 0:01:01.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.050) 0:01:01.393 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.033) 0:01:01.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.035) 0:01:01.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.029) 0:01:01.492 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.032) 0:01:01.524 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.038) 0:01:01.563 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:25:08 +0000 (0:00:00.040) 0:01:01.603 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100660.2191215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100660.2191215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1191, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100660.2191215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.392) 0:01:01.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.036) 0:01:02.033 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.035) 0:01:02.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.032) 0:01:02.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.030) 0:01:02.131 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.036) 0:01:02.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.031) 0:01:02.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.030) 0:01:02.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.027) 0:01:02.257 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.036) 0:01:02.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.027) 0:01:02.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.027) 0:01:02.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.029) 0:01:02.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.029) 0:01:02.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.030) 0:01:02.439 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.037) 0:01:02.477 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.034) 0:01:02.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.029) 0:01:02.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.032) 0:01:02.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.032) 0:01:02.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.032) 0:01:02.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.032) 0:01:02.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.031) 0:01:02.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.031) 0:01:02.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.036) 0:01:02.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.031) 0:01:02.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.036) 0:01:02.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:25:09 +0000 (0:00:00.038) 0:01:02.877 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.376) 0:01:03.253 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.374) 0:01:03.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.039) 0:01:03.667 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.034) 0:01:03.701 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.031) 0:01:03.733 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.031) 0:01:03.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.033) 0:01:03.798 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.031) 0:01:03.830 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:25:10 +0000 (0:00:00.030) 0:01:03.861 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.034) 0:01:03.895 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.033) 0:01:03.929 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.039) 0:01:03.969 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.031077", "end": "2022-06-01 12:25:10.943518", "rc": 0, "start": "2022-06-01 12:25:10.912441" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.402) 0:01:04.371 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.038) 0:01:04.409 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.040) 0:01:04.450 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.033) 0:01:04.483 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.034) 0:01:04.518 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.033) 0:01:04.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.033) 0:01:04.585 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.035) 0:01:04.621 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.031) 0:01:04.652 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.027) 0:01:04.680 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:91
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.088) 0:01:04.710 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.088) 0:01:04.799 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:25:11 +0000 (0:00:00.045) 0:01:04.845 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.519) 0:01:05.364 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.073) 0:01:05.438 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.035) 0:01:05.473 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.034) 0:01:05.507 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.065) 0:01:05.573 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.027) 0:01:05.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.032) 0:01:05.633 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.040) 0:01:05.674 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.034) 0:01:05.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.034) 0:01:05.743 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.035) 0:01:05.778 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.035) 0:01:05.813 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:25:12 +0000 (0:00:00.031) 0:01:05.845 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:25:13 +0000 (0:00:00.044) 0:01:05.890 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:25:13 +0000 (0:00:00.029) 0:01:05.919 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:25:14 +0000 (0:00:01.934) 0:01:07.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.032) 0:01:07.886 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.028) 0:01:07.914 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.040) 0:01:07.955 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.039) 0:01:07.994 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.037) 0:01:08.032 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:25:15 +0000 (0:00:00.391) 0:01:08.424 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:25:16 +0000 (0:00:00.653) 0:01:09.078 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:25:16 +0000 (0:00:00.038) 0:01:09.116 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:25:16 +0000 (0:00:00.673) 0:01:09.789 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:25:17 +0000 (0:00:00.376) 0:01:10.166 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:25:17 +0000
(0:00:00.031) 0:01:10.198 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:105
Wednesday 01 June 2022 16:25:18 +0000 (0:00:00.857) 0:01:11.056 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:25:18 +0000 (0:00:00.068) 0:01:11.124 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:25:18 +0000 (0:00:00.040) 0:01:11.165 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:25:18 +0000 (0:00:00.029) 0:01:11.194 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:25:18 +0000 (0:00:00.379) 0:01:11.573 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002826", "end": "2022-06-01 12:25:18.533721", "rc": 0, "start": "2022-06-01 12:25:18.530895" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.391) 0:01:11.965 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002907", "end": "2022-06-01 12:25:18.931424", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:25:18.928517" }
STDERR: cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.400) 0:01:12.365 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.070) 0:01:12.435 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.098) 0:01:12.534 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.068) 0:01:12.602 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.043) 0:01:12.646 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.032) 0:01:12.679 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.032) 0:01:12.711 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.048) 0:01:12.760 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:25:19 +0000 (0:00:00.069) 0:01:12.830 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.066) 0:01:12.896 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.063) 0:01:12.959 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.046) 0:01:13.006 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.059) 0:01:13.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.031) 0:01:13.097 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.029) 0:01:13.127 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.028) 0:01:13.156 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.030) 0:01:13.186 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.033) 0:01:13.219 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.029) 0:01:13.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.028) 0:01:13.278 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.030) 0:01:13.308 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.056) 0:01:13.365 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.067) 0:01:13.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.031) 0:01:13.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.031) 0:01:13.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.030) 0:01:13.525 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.062) 0:01:13.588 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.034) 0:01:13.623 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.029) 0:01:13.652 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.030) 0:01:13.683 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.031) 0:01:13.714 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.064) 0:01:13.778 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.064) 0:01:13.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:25:20 +0000 (0:00:00.033) 0:01:13.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.030) 0:01:13.906 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.030) 0:01:13.936 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.030) 0:01:13.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.030) 0:01:13.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.029) 0:01:14.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.031) 0:01:14.058 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.031) 0:01:14.089 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.030) 0:01:14.119 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.055) 0:01:14.175 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.036) 0:01:14.212 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.119) 0:01:14.331 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.039) 0:01:14.371 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.039) 0:01:14.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.031) 0:01:14.442 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.035) 0:01:14.477 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.028) 0:01:14.506 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.029) 0:01:14.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.031) 0:01:14.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.070) 0:01:14.638 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.032) 0:01:14.671 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.046) 0:01:14.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.026) 0:01:14.744 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.035) 0:01:14.780 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.029) 0:01:14.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.033) 0:01:14.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:25:21 +0000 (0:00:00.029) 0:01:14.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.026) 0:01:14.899 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.361) 0:01:15.261 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.035) 0:01:15.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.027) 0:01:15.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.034) 0:01:15.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.036) 0:01:15.392 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.036) 0:01:15.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.039) 0:01:15.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.034) 0:01:15.502 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.034) 0:01:15.537 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.027) 0:01:15.564 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.033) 0:01:15.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.031) 0:01:15.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.030) 0:01:15.660 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.030) 0:01:15.691 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.030) 0:01:15.722 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.045) 0:01:15.767 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.040) 0:01:15.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.030) 0:01:15.837 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:25:22 +0000 (0:00:00.030) 0:01:15.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.029) 0:01:15.898 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.032) 0:01:15.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:15.961 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.034) 0:01:15.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.027 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.058 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.089 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.119 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.150 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.032) 0:01:16.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.214 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.244 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.033) 0:01:16.278 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.308 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.338 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.034) 0:01:16.372 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.032) 0:01:16.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.436 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.036) 0:01:16.472 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.035) 0:01:16.508 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.032) 0:01:16.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.034) 0:01:16.574 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.033) 0:01:16.607 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.032) 0:01:16.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.734 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.033) 0:01:16.767 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.799 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.031) 0:01:16.830 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.030) 0:01:16.861 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=471 changed=4 unreachable=0 failed=0 skipped=364 rescued=0 ignored=0

Wednesday 01 June 2022 16:25:23 +0000 (0:00:00.017) 0:01:16.878 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.35s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:2 ----------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:25:24 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Wednesday 01 June 2022 16:25:26 +0000 (0:00:01.292) 0:00:01.314 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.29s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_fs_nvme_generated.yml ***********************************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_fs_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Wednesday 01 June 2022 16:25:26 +0000 (0:00:00.019) 0:00:01.334 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.29s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:25:26 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:25:28 +0000 (0:00:01.250) 0:00:01.273 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_change_fs_scsi_generated.yml *********************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_change_fs_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs_scsi_generated.yml:3 Wednesday 
01 June 2022 16:25:28 +0000 (0:00:00.017) 0:00:01.291 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs_scsi_generated.yml:7 Wednesday 01 June 2022 16:25:29 +0000 (0:00:01.032) 0:00:02.324 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:2 Wednesday 01 June 2022 16:25:29 +0000 (0:00:00.026) 0:00:02.351 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:13 Wednesday 01 June 2022 16:25:29 +0000 (0:00:00.816) 0:00:03.167 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:25:29 +0000 (0:00:00.040) 0:00:03.208 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.159) 0:00:03.368 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.524) 0:00:03.892 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.078) 0:00:03.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.025) 0:00:03.996 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.022) 0:00:04.018 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:25:30 +0000 (0:00:00.198) 0:00:04.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:25:31 +0000 (0:00:00.018) 0:00:04.235 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:25:32 +0000 (0:00:01.108) 0:00:05.343 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:25:32 +0000 (0:00:00.045) 0:00:05.389 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:25:32 +0000 (0:00:00.045) 0:00:05.434 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:25:32 +0000 (0:00:00.695) 0:00:06.130 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:25:32 +0000 (0:00:00.080) 0:00:06.211 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:25:33 +0000 (0:00:00.020) 0:00:06.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:25:33 +0000 (0:00:00.022) 0:00:06.253 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:25:33 +0000 (0:00:00.021) 0:00:06.275 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:25:33 +0000 (0:00:00.860) 0:00:07.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:25:35 +0000 (0:00:01.804) 0:00:08.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:25:35 +0000 (0:00:00.042) 0:00:08.982 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:25:35 +0000 (0:00:00.026) 0:00:09.009 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.521) 0:00:09.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.029) 0:00:09.560 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.027) 0:00:09.588 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.031) 0:00:09.619 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.032) 0:00:09.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.031) 0:00:09.683 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.030) 0:00:09.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.028) 0:00:09.742 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.027) 0:00:09.769 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:25:36 +0000 (0:00:00.027) 0:00:09.797 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:25:37 +0000 (0:00:00.468) 0:00:10.265 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:25:37 +0000 (0:00:00.028) 0:00:10.294 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:16 Wednesday 01 June 2022 16:25:37 +0000 (0:00:00.847) 0:00:11.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:23 Wednesday 01 June 2022 16:25:37 +0000 (0:00:00.029) 0:00:11.172 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:25:37 +0000 (0:00:00.046) 0:00:11.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task 
path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.512) 0:00:11.730 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.036) 0:00:11.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.028) 0:00:11.796 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a LVM logical volume with default fs_type] ************************ task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:28 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.031) 0:00:11.827 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.054) 0:00:11.881 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:25:38 +0000 (0:00:00.043) 0:00:11.925 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.504) 0:00:12.429 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.073) 0:00:12.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.029) 0:00:12.533 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.029) 0:00:12.562 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.059) 0:00:12.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.025) 0:00:12.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.029) 0:00:12.677 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.068) 0:00:12.745 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.031) 0:00:12.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.028) 0:00:12.806 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.029) 0:00:12.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.028) 0:00:12.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.028) 0:00:12.893 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.044) 0:00:12.937 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:25:39 +0000 (0:00:00.027) 0:00:12.965 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", 
"fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:25:41 +0000 (0:00:01.749) 
0:00:14.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.030) 0:00:14.744 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.027) 0:00:14.772 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.041) 0:00:14.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.046) 0:00:14.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.038) 0:00:14.898 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:25:41 +0000 (0:00:00.031) 0:00:14.930 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:25:42 +0000 (0:00:00.912) 0:00:15.842 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, 
"path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:25:43 +0000 (0:00:00.549) 0:00:16.392 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:25:43 +0000 (0:00:00.652) 0:00:17.045 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:25:44 +0000 (0:00:00.366) 0:00:17.411 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:25:44 +0000 (0:00:00.027) 0:00:17.439 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:40 Wednesday 01 June 2022 16:25:45 +0000 (0:00:00.842) 0:00:18.281 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:25:45 +0000 (0:00:00.054) 0:00:18.336 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:25:45 +0000 (0:00:00.078) 0:00:18.414 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:25:45 +0000 (0:00:00.030) 0:00:18.444 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {"fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c"},
        "/dev/sda": {"fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "W6g1Mb-7gF4-KLph-5iAg-OEaj-YZfV-itBUSk"},
        "/dev/sdb": {"fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sdc": {"fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sr0": {"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00"},
        "/dev/vda": {"fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vda1": {"fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": ""},
        "/dev/vda2": {"fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7"},
        "/dev/vda3": {"fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7"},
        "/dev/vda4": {"fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345"},
        "/dev/vdb": {"fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdc": {"fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdd": {"fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": ""}
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:25:45 +0000 (0:00:00.471) 0:00:18.916 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002543", "end": "2022-06-01 12:25:45.597477", "rc": 0, "start": "2022-06-01 12:25:45.594934"}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.455) 0:00:19.371 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.002751", "end": "2022-06-01 12:25:45.959785", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:25:45.957034"}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.362) 0:00:19.733 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.064) 0:00:19.798 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pool_tests": ["members", "volumes"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.030) 0:00:19.829 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.066) 0:00:19.895 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": ["/dev/sda"]}, "changed": false}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:25:46 +0000 (0:00:00.038) 0:00:19.934 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.469) 0:00:20.404 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"ansible_facts": {"_storage_test_pool_pvs": ["/dev/sda"]}, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda"}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.041) 0:00:20.446 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.038) 0:00:20.484 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.034) 0:00:20.518 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.035) 0:00:20.553 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:20.583 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda"}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.044) 0:00:20.627 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.055) 0:00:20.683 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:20.713 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:20.742 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.031) 0:00:20.773 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.030) 0:00:20.804 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:20.833 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:20.863 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.031) 0:00:20.895 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.030) 0:00:20.926 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.059) 0:00:20.985 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.059) 0:00:21.045 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.029) 0:00:21.075 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.036) 0:00:21.111 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.032) 0:00:21.144 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:25:47 +0000 (0:00:00.060) 0:00:21.205 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.034) 0:00:21.239 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.082) 0:00:21.321 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.059) 0:00:21.381 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.035) 0:00:21.417 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.036) 0:00:21.453 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.483 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.031) 0:00:21.515 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.545 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.576 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.606 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.063) 0:00:21.670 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.062) 0:00:21.732 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.762 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.029) 0:00:21.792 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.031) 0:00:21.823 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.029) 0:00:21.853 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.029) 0:00:21.883 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.913 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.943 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.030) 0:00:21.974 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.032) 0:00:22.007 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.061) 0:00:22.068 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:25:48 +0000 (0:00:00.035) 0:00:22.103 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.132) 0:00:22.236 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.035) 0:00:22.271 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [{"block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c"}],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [{"block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c"}],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.040) 0:00:22.312 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.036) 0:00:22.348 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.033) 0:00:22.382 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.035) 0:00:22.418 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.031) 0:00:22.449 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.029) 0:00:22.478 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.028) 0:00:22.507 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.030) 0:00:22.537 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/foo-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.045) 0:00:22.583 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.037) 0:00:22.621 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.038) 0:00:22.660 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.030) 0:00:22.690 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.029) 0:00:22.720 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.035) 0:00:22.756 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.036) 0:00:22.792 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654100740.8821216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100740.8821216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1420, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100740.8821216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.381) 0:00:23.174 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:25:49 +0000 (0:00:00.038) 0:00:23.219 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.038) 0:00:23.258 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.036) 0:00:23.294 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.031) 0:00:23.325 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.035) 0:00:23.361 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.029) 0:00:23.391 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.029) 0:00:23.421 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.029) 0:00:23.450 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.082) 0:00:23.533 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.564 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.594 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.624 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.654 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.685 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.040) 0:00:23.726 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.037) 0:00:23.763 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.031) 0:00:23.795 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.031) 0:00:23.827 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:23.857 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.032) 0:00:23.890 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.034) 0:00:23.924 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.031) 0:00:23.956 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.032) 0:00:23.989 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.029) 0:00:24.018 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.028) 0:00:24.047 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.028) 0:00:24.075 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:25:50 +0000 (0:00:00.030) 0:00:24.106 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.470) 0:00:24.576 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.376) 0:00:24.953 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "5368709120"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.037) 0:00:24.991 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "5368709120"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.033) 0:00:25.025 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.030) 0:00:25.056 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.029) 0:00:25.086 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.030) 0:00:25.117 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.032) 0:00:25.149 ********
skipping: [/cache/rhel-x.qcow2]
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.031) 0:00:25.180 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:25:51 +0000 (0:00:00.035) 0:00:25.216 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.034) 0:00:25.250 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.039) 0:00:25.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.034913", "end": "2022-06-01 12:25:51.924481", "rc": 0, "start": "2022-06-01 12:25:51.889568" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.407) 0:00:25.697 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.038) 0:00:25.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.048) 0:00:25.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.033) 0:00:25.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.033) 0:00:25.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.033) 0:00:25.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.034) 0:00:25.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable 
namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.031) 0:00:25.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.032) 0:00:25.984 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.036) 0:00:26.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the file system signature on the logical volume created above] **** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:42 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.034) 0:00:26.055 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.077) 0:00:26.133 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:25:52 +0000 (0:00:00.044) 0:00:26.177 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.528) 0:00:26.706 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.072) 0:00:26.778 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.030) 0:00:26.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.030) 0:00:26.839 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.061) 0:00:26.901 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.025) 0:00:26.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.031) 0:00:26.957 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.037) 0:00:26.995 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.031) 0:00:27.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.029) 0:00:27.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.028) 0:00:27.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.029) 0:00:27.114 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.030) 0:00:27.145 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.043) 0:00:27.188 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:25:53 +0000 (0:00:00.028) 0:00:27.217 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb",
"/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:25:55 +0000 (0:00:01.334) 0:00:28.551 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.030) 0:00:28.582 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.027) 0:00:28.609 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.041) 0:00:28.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, 
"changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.037) 0:00:28.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.034) 0:00:28.723 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:25:55 +0000 (0:00:00.028) 0:00:28.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:25:56 +0000 (0:00:00.642) 0:00:29.394 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:25:56 +0000 (0:00:00.371) 0:00:29.766 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:25:57 +0000 (0:00:00.651) 0:00:30.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:25:57 +0000 (0:00:00.364) 0:00:30.782 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:25:57 +0000 (0:00:00.030) 0:00:30.812 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:55 Wednesday 01 June 2022 16:25:58 +0000 (0:00:00.912) 0:00:31.724 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:25:58 +0000 (0:00:00.087) 0:00:31.811 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:25:58 +0000 (0:00:00.039) 0:00:31.851 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:25:58 +0000 (0:00:00.030) 0:00:31.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "W6g1Mb-7gF4-KLph-5iAg-OEaj-YZfV-itBUSk" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.403) 0:00:32.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002505", "end": "2022-06-01 12:25:58.884429", "rc": 0, "start": "2022-06-01 12:25:58.881924" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.386) 0:00:32.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002855", "end": "2022-06-01 12:25:59.273374", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:25:59.270519" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.378) 0:00:33.050 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.068) 0:00:33.119 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.033) 0:00:33.152 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:25:59 +0000 (0:00:00.064) 0:00:33.216 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.042) 0:00:33.259 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.389) 0:00:33.648 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.042) 0:00:33.690 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.037) 0:00:33.728 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.038) 0:00:33.766 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.037) 0:00:33.804 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.033) 0:00:33.837 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.043) 0:00:33.881 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.060) 0:00:33.941 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.030) 0:00:33.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.029) 0:00:34.002 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.030) 0:00:34.033 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.029) 0:00:34.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.027) 0:00:34.091 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.029) 0:00:34.120 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.030) 0:00:34.151 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:26:00 +0000 (0:00:00.030) 0:00:34.182 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.059) 0:00:34.242 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.059) 0:00:34.302 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.030) 0:00:34.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01
June 2022 16:26:01 +0000 (0:00:00.031) 0:00:34.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:34.395 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.061) 0:00:34.457 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.035) 0:00:34.492 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.033) 0:00:34.525 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.100) 0:00:34.626 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.035) 0:00:34.661 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.035) 0:00:34.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.029) 0:00:34.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.029) 0:00:34.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:34.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.029) 0:00:34.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.030) 0:00:34.848 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.062) 0:00:34.910 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.071) 0:00:34.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.036) 0:00:35.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.035) 0:00:35.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:35.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.033) 0:00:35.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:35.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:35.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:26:01 +0000 (0:00:00.031) 0:00:35.213 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.032) 0:00:35.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.031) 0:00:35.277 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task
to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.067) 0:00:35.344 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.035) 0:00:35.380 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.123) 0:00:35.504 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.035) 0:00:35.539 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 1290774,
                "block_size": 4096,
                "block_total": 1308160,
                "block_used": 17386,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 2621437,
                "inode_total": 2621440,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 5287010304,
                "size_total": 5358223360,
                "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 1290774,
                "block_size": 4096,
                "block_total": 1308160,
                "block_used": 17386,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 2621437,
                "inode_total": 2621440,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 5287010304,
                "size_total": 5358223360,
                "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.044) 0:00:35.583 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:35.620 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.035) 0:00:35.656 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:35.692 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.030) 0:00:35.723 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.032) 0:00:35.755 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:35.792 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.035) 0:00:35.827 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.048) 0:00:35.876 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:35.912 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.037) 0:00:35.950 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.030) 0:00:35.981 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.033) 0:00:36.015 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:36.051 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:26:02 +0000 (0:00:00.036) 0:00:36.088 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654100740.8821216,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654100740.8821216,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 1420,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654100740.8821216,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.392) 0:00:36.480 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.037) 0:00:36.518 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.035) 0:00:36.553 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.032) 0:00:36.586 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.030) 0:00:36.616 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.035) 0:00:36.652 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.029) 0:00:36.681 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.029) 0:00:36.711 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.029) 0:00:36.740 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.040) 0:00:36.781 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.084) 0:00:36.866 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.031) 0:00:36.898 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.030) 0:00:36.928 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.030) 0:00:36.959 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.029) 0:00:36.988 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.036) 0:00:37.024 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.036) 0:00:37.061 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.031) 0:00:37.092 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.028) 0:00:37.121 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.029) 0:00:37.151 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.033) 0:00:37.184 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:26:03 +0000 (0:00:00.031) 0:00:37.215 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.034) 0:00:37.250 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.030) 0:00:37.280 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.029) 0:00:37.310 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.029) 0:00:37.339 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.030) 0:00:37.370 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.030) 0:00:37.401 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.377) 0:00:37.779 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.374) 0:00:38.154 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "5368709120"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.037) 0:00:38.191 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:26:04 +0000 (0:00:00.036) 0:00:38.228 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.033) 0:00:38.261 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.035) 0:00:38.296 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.034) 0:00:38.331 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.032) 0:00:38.363 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.032) 0:00:38.396 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 5368709120,
        "changed": false,
        "failed": false,
        "lvm": "5g",
        "parted": "5GiB",
        "size": "5 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.041) 0:00:38.437 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.033) 0:00:38.471 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.041) 0:00:38.512 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.039614",
    "end": "2022-06-01 12:26:05.143815",
    "rc": 0,
    "start": "2022-06-01 12:26:05.104201"
}

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.409) 0:00:38.922 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.039) 0:00:38.961 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.040) 0:00:39.001 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.034) 0:00:39.035 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.035) 0:00:39.071 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.033) 0:00:39.104 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.032) 0:00:39.137 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.030) 0:00:39.168 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.029) 0:00:39.198 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:26:05 +0000 (0:00:00.027) 0:00:39.225 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Re-run the role on the same volume without specifying fs_type] ***********
task path:
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:57 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.033) 0:00:39.258 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.070) 0:00:39.329 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.045) 0:00:39.374 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.585) 0:00:39.960 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of 
pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.073) 0:00:40.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.031) 0:00:40.065 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.031) 0:00:40.097 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.063) 0:00:40.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.025) 0:00:40.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:26:06 +0000 (0:00:00.029) 0:00:40.216 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.036) 0:00:40.253 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.034) 0:00:40.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.030) 0:00:40.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.030) 0:00:40.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.030) 0:00:40.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.029) 0:00:40.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.043) 0:00:40.450 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:26:07 +0000 (0:00:00.029) 0:00:40.480 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:26:08 +0000 (0:00:01.330) 0:00:41.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.031) 0:00:41.842 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.028) 0:00:41.871 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.038) 0:00:41.910 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.037) 0:00:41.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.035) 0:00:41.983 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:26:08 +0000 (0:00:00.029) 0:00:42.012 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:26:09 +0000 (0:00:00.665) 0:00:42.678 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:26:09 +0000 (0:00:00.381) 0:00:43.060 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:26:10 +0000 (0:00:00.691) 0:00:43.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:26:10 +0000 (0:00:00.368) 0:00:44.119 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:26:10 +0000 (0:00:00.029) 0:00:44.149 ******** ok: [/cache/rhel-x.qcow2] TASK [Verify the output of the duplicate volumes test] ************************* task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:69 Wednesday 01 June 2022 16:26:11 +0000 (0:00:00.924) 0:00:45.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:74 Wednesday 01 June 2022 16:26:11 +0000 (0:00:00.073) 0:00:45.147 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:26:11 +0000 (0:00:00.071) 0:00:45.219 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:26:12 +0000 (0:00:00.043) 0:00:45.263 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:26:12 +0000 (0:00:00.033) 0:00:45.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "W6g1Mb-7gF4-KLph-5iAg-OEaj-YZfV-itBUSk" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": 
"/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:26:12 +0000 (0:00:00.369) 0:00:45.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002695", "end": "2022-06-01 12:26:12.275569", "rc": 0, "start": "2022-06-01 12:26:12.272874" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:26:12 +0000 (0:00:00.391) 0:00:46.057 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002542", "end": "2022-06-01 12:26:12.659246", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:26:12.656704" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.376) 0:00:46.434 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.067) 0:00:46.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.031) 0:00:46.533 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.065) 0:00:46.599 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.040) 0:00:46.639 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.379) 0:00:47.018 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.042) 0:00:47.061 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.038) 0:00:47.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.035) 0:00:47.135 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.036) 0:00:47.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:26:13 +0000 (0:00:00.031) 0:00:47.203 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.043) 0:00:47.246 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.053) 0:00:47.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:47.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.029) 0:00:47.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.032) 0:00:47.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:47.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:47.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:26:14 +0000 (0:00:00.030) 0:00:47.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:47.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.031) 0:00:47.546 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.058) 0:00:47.605 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.059) 0:00:47.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:47.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:26:14 +0000 (0:00:00.031) 0:00:47.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.029) 0:00:47.757 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.060) 0:00:47.817 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.033) 0:00:47.851 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.107) 0:00:47.959 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.059) 0:00:48.018 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.035) 0:00:48.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.035) 0:00:48.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.029) 0:00:48.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:48.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:48.179 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:26:14 +0000 (0:00:00.030) 0:00:48.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:48.241 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.064) 0:00:48.305 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.065) 0:00:48.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:48.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.031) 0:00:48.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.031) 0:00:48.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.029) 0:00:48.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.031) 0:00:48.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:48.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:48.587 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:48.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.032) 0:00:48.649 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.058) 0:00:48.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.033) 0:00:48.742 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.123) 0:00:48.866 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.037) 0:00:48.903 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.041) 0:00:48.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.035) 0:00:48.980 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:26:15 +0000 
(0:00:00.034) 0:00:49.015 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.036) 0:00:49.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.032) 0:00:49.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.033) 0:00:49.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.030) 0:00:49.147 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.031) 0:00:49.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:26:15 +0000 (0:00:00.046) 0:00:49.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.035) 0:00:49.260 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.035) 0:00:49.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.031) 0:00:49.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.031) 0:00:49.358 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.035) 0:00:49.394 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.035) 0:00:49.429 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100740.8821216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100740.8821216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1420, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100740.8821216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.380) 0:00:49.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.037) 0:00:49.846 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.039) 0:00:49.886 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.036) 0:00:49.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.030) 0:00:49.953 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.034) 0:00:49.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.030) 0:00:50.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.030) 0:00:50.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.029) 0:00:50.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.039) 0:00:50.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.068) 0:00:50.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:26:16 +0000 (0:00:00.031) 0:00:50.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.031) 0:00:50.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.030) 0:00:50.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.030) 0:00:50.310 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.036) 0:00:50.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.036) 0:00:50.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.029) 0:00:50.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.028) 0:00:50.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.029) 0:00:50.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.029) 0:00:50.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.029) 0:00:50.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.032) 0:00:50.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.030) 0:00:50.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.030) 0:00:50.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.033) 0:00:50.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.034) 0:00:50.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.030) 0:00:50.721 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:26:17 +0000 (0:00:00.381) 0:00:51.103 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.371) 0:00:51.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.038) 0:00:51.514 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.034) 0:00:51.548 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.030) 0:00:51.579 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.031) 0:00:51.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.030) 0:00:51.642 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.030) 0:00:51.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.030) 0:00:51.703 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.036) 0:00:51.740 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.033) 0:00:51.774 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:26:18 +0000 (0:00:00.040) 0:00:51.814 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036769", "end": "2022-06-01 12:26:18.466099", "rc": 0, "start": "2022-06-01 12:26:18.429330" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.432) 0:00:52.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.037) 0:00:52.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.037) 0:00:52.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.032) 0:00:52.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.035) 0:00:52.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.034) 0:00:52.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.032) 0:00:52.457 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.031) 0:00:52.488 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.029) 0:00:52.518 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.030) 0:00:52.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:76
Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.030) 0:00:52.578 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.075) 0:00:52.654 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.047) 0:00:52.701 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:26:19 +0000 (0:00:00.507) 0:00:53.209 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.072) 0:00:53.282 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.032) 0:00:53.314 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.031) 0:00:53.345 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.063) 0:00:53.409 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.025) 0:00:53.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.029) 0:00:53.464 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.034) 0:00:53.537 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.030) 0:00:53.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.030) 0:00:53.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.030) 0:00:53.628 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.030) 0:00:53.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.030) 0:00:53.702 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.043) 0:00:53.733 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:26:20 +0000 (0:00:00.031) 0:00:53.733 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:26:21 +0000 (0:00:01.331) 0:00:55.064 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:26:21 +0000 (0:00:00.030) 0:00:55.095 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:26:21 +0000 (0:00:00.029) 0:00:55.124 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:26:21 +0000 (0:00:00.039) 0:00:55.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:26:21 +0000 (0:00:00.037) 0:00:55.202 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:26:22 +0000 (0:00:00.037) 0:00:55.240 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:26:22 +0000 (0:00:00.029) 0:00:55.269 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:26:22 +0000 (0:00:00.670) 0:00:55.940 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:26:23 +0000 (0:00:00.399) 0:00:56.339 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:26:23 +0000 (0:00:00.658) 0:00:56.997 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:26:24 +0000 (0:00:00.377) 0:00:57.374 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:26:24 +0000 (0:00:00.030) 0:00:57.405 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:89
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.874) 0:00:58.280 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.118) 0:00:58.398 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.042) 0:00:58.441 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.029) 0:00:58.470 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "W6g1Mb-7gF4-KLph-5iAg-OEaj-YZfV-itBUSk" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.365) 0:00:58.836 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002736", "end": "2022-06-01 12:26:25.429083", "rc": 0, "start": "2022-06-01 12:26:25.426347" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:26:25 +0000 (0:00:00.369) 0:00:59.205 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002380", "end": "2022-06-01 12:26:25.792507", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:26:25.790127" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.361) 0:00:59.567 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.064) 0:00:59.631 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.031) 0:00:59.663 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.065) 0:00:59.728 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.040) 0:00:59.769 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.388) 0:01:00.157 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:26:26 +0000 (0:00:00.042) 0:01:00.200 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.038) 0:01:00.238 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.034) 0:01:00.273 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.035) 0:01:00.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.031) 0:01:00.340 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.040) 0:01:00.381 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID]
**********
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.058) 0:01:00.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.031) 0:01:00.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.030) 0:01:00.502 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.030) 0:01:00.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.038) 0:01:00.571 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.036) 0:01:00.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.029) 0:01:00.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.028) 0:01:00.666 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.029) 0:01:00.696 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.060) 0:01:00.756 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.059) 0:01:00.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.030) 0:01:00.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.029) 0:01:00.877 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.032) 0:01:00.910 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.062) 0:01:00.972 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.036) 0:01:01.008 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.034) 0:01:01.043 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.112) 0:01:01.156 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.037) 0:01:01.193 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:26:27 +0000 (0:00:00.036) 0:01:01.229 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.030) 0:01:01.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.028) 0:01:01.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.027) 0:01:01.316 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.031) 0:01:01.347 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.028) 0:01:01.376 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.063) 0:01:01.439 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.067) 0:01:01.506 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.032) 0:01:01.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.032) 0:01:01.571 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.029) 0:01:01.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.028) 0:01:01.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.033) 0:01:01.663 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.032) 0:01:01.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.030) 0:01:01.726 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.029) 0:01:01.756 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.027) 0:01:01.783 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use.
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.062) 0:01:01.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.035) 0:01:01.881 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.119) 0:01:02.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.034) 0:01:02.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "e1b833c4-6052-4953-8ad8-75695541ef5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.042) 0:01:02.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.040) 0:01:02.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:26:28 +0000 
(0:00:00.036) 0:01:02.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.038) 0:01:02.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:26:28 +0000 (0:00:00.031) 0:01:02.226 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.031) 0:01:02.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.030) 0:01:02.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.034) 0:01:02.322 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.047) 0:01:02.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.035) 0:01:02.405 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.035) 0:01:02.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.029) 0:01:02.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.030) 0:01:02.499 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.037) 0:01:02.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.035) 0:01:02.572 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100740.8821216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100740.8821216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1420, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100740.8821216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.383) 0:01:02.956 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.037) 0:01:02.993 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.035) 0:01:03.029 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.034) 0:01:03.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.032) 0:01:03.096 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.038) 0:01:03.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.032) 0:01:03.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:26:29 +0000 (0:00:00.033) 0:01:03.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.031) 0:01:03.231 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.038) 0:01:03.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.031) 0:01:03.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.072) 0:01:03.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.032) 0:01:03.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.031) 0:01:03.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.030) 0:01:03.468 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.037) 0:01:03.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.036) 0:01:03.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.033) 0:01:03.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.030) 0:01:03.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.031) 0:01:03.636 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.031) 0:01:03.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.029) 0:01:03.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.029) 0:01:03.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.038) 0:01:03.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.036) 0:01:03.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.033) 0:01:03.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.032) 0:01:03.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:26:30 +0000 (0:00:00.035) 0:01:03.905 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.376) 0:01:04.281 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.368) 0:01:04.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.038) 0:01:04.688 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.035) 0:01:04.724 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.032) 0:01:04.757 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.032) 0:01:04.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.031) 0:01:04.822 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.031) 0:01:04.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.031) 0:01:04.885 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.034) 0:01:04.920 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.038) 0:01:04.958 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:26:31 +0000 (0:00:00.039) 0:01:04.998 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.034411", "end": "2022-06-01 12:26:31.626711", "rc": 0, "start": "2022-06-01 12:26:31.592300" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.405) 0:01:05.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.036) 0:01:05.440 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.037) 0:01:05.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.033) 0:01:05.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.032) 0:01:05.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.031) 0:01:05.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.032) 0:01:05.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.031) 0:01:05.639 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.029) 0:01:05.668 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.028) 0:01:05.696 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:91 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.030) 0:01:05.727 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.105) 0:01:05.833 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:26:32 +0000 (0:00:00.048) 0:01:05.881 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.552) 0:01:06.434 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of 
pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.072) 0:01:06.507 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.031) 0:01:06.538 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.030) 0:01:06.569 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.063) 0:01:06.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.026) 0:01:06.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.030) 0:01:06.690 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "fs_type": "xfs", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.038) 0:01:06.728 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.033) 0:01:06.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.031) 0:01:06.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.032) 0:01:06.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.030) 0:01:06.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.030) 0:01:06.886 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.044) 0:01:06.930 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:26:33 +0000 (0:00:00.030) 0:01:06.961 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:26:35 +0000 (0:00:01.921) 0:01:08.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:26:35 +0000 (0:00:00.031) 0:01:08.915 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:26:35 +0000 (0:00:00.028) 0:01:08.943 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": 
true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:26:35 +0000 (0:00:00.040) 0:01:08.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:26:35 +0000 (0:00:00.036) 0:01:09.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:26:35 +0000 (0:00:00.034) 0:01:09.055 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": 
"mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:26:36 +0000 (0:00:00.395) 0:01:09.450 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:26:36 +0000 (0:00:00.672) 0:01:10.123 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:26:36 +0000 (0:00:00.034) 0:01:10.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:26:37 +0000 (0:00:00.627) 0:01:10.785 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:26:37 +0000 (0:00:00.376) 0:01:11.161 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:26:37 +0000 
(0:00:00.032) 0:01:11.194 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs.yml:105 Wednesday 01 June 2022 16:26:38 +0000 (0:00:00.865) 0:01:12.059 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:26:38 +0000 (0:00:00.118) 0:01:12.177 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:26:38 +0000 (0:00:00.041) 0:01:12.219 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:26:39 +0000 (0:00:00.030) 0:01:12.250 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:26:39 +0000 (0:00:00.382) 0:01:12.999 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [ "cat", "/etc/fstab" ],
    "delta": "0:00:00.002768",
    "end": "2022-06-01 12:26:39.211238",
    "rc": 0,
    "start": "2022-06-01 12:26:39.208470"
}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:26:39 +0000 (0:00:00.382) 0:01:12.999 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [ "cat", "/etc/crypttab" ],
    "delta": "0:00:00.003060",
    "end": "2022-06-01 12:26:39.598346",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-06-01 12:26:39.595286"
}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.374) 0:01:13.374 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
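[Editor's note: the crypttab read above exits with rc 1 (the file does not exist), yet Ansible reports the task as ok with "failed_when_result": false. That is the standard pattern of overriding failure detection on a probing command. A minimal sketch of such a task, assuming this shape — the register variable name is illustrative, not taken from the actual test file:]

```yaml
# Hypothetical sketch: read a file that may not exist, and keep the
# play green even when cat exits non-zero.
- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: storage_test_crypttab   # illustrative variable name
  changed_when: false               # reading a file never changes state
  failed_when: false                # rc 1 (missing file) is acceptable here
```

Because `failed_when: false` never evaluates truthy, the result carries `"failed_when_result": false` alongside the non-zero `rc`, which matches the output above.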
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.064) 0:01:13.438 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.031) 0:01:13.470 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.060) 0:01:13.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.038) 0:01:13.568 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.027) 0:01:13.595 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.029) 0:01:13.625 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.036) 0:01:13.662 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.035) 0:01:13.697 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.034) 0:01:13.732 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.031) 0:01:13.763 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.027) 0:01:13.790 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.058) 0:01:13.849 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.031) 
0:01:13.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.031) 0:01:13.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.029) 0:01:13.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.029) 0:01:13.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.031) 0:01:14.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.029) 0:01:14.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.029) 0:01:14.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.030) 0:01:14.092 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.058) 0:01:14.151 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:26:40 +0000 (0:00:00.057) 0:01:14.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.029) 0:01:14.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.029) 0:01:14.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.029) 0:01:14.297 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.062) 0:01:14.359 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.034) 0:01:14.394 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.027) 0:01:14.422 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.026) 0:01:14.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.030) 0:01:14.478 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.062) 0:01:14.541 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.062) 0:01:14.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.029) 0:01:14.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.068) 0:01:14.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.032) 0:01:14.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.032) 0:01:14.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.031) 0:01:14.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.029) 0:01:14.827 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.030) 0:01:14.858 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.034) 0:01:14.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.031) 0:01:14.924 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
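[Editor's note: the [WARNING] above flags a collision on the loop variable 'storage_test_volume' between an outer loop and an inner included-task loop. The fix Ansible recommends is `loop_control.loop_var`; a sketch of what that looks like, with illustrative task and variable names:]

```yaml
# Hypothetical sketch of the loop_control fix the warning recommends:
# give the inner include's loop its own variable name so it cannot
# shadow 'storage_test_volume' from an enclosing loop.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"  # illustrative source list
  loop_control:
    loop_var: storage_test_inner_volume    # distinct name avoids the collision
```

Inside test-verify-volume.yml the items would then be referenced as `storage_test_inner_volume` instead of the default `item` or the colliding outer name.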
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.058) 0:01:14.982 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.036) 0:01:15.018 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.122) 0:01:15.140 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.035) 0:01:15.176 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:26:41 +0000 (0:00:00.041) 0:01:15.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.029) 0:01:15.247 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.034) 0:01:15.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.029) 0:01:15.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.028) 0:01:15.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.028) 
0:01:15.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.031) 0:01:15.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.030) 0:01:15.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.044) 0:01:15.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.025) 0:01:15.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.034) 0:01:15.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.029) 0:01:15.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.033) 0:01:15.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.029) 0:01:15.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.025) 0:01:15.654 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.384) 0:01:16.039 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.037) 0:01:16.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.035) 0:01:16.112 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.044) 0:01:16.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.035) 0:01:16.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:26:42 +0000 (0:00:00.026) 0:01:16.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.026) 0:01:16.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.031) 0:01:16.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.489 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.037) 0:01:16.527 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.037) 0:01:16.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.028) 0:01:16.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.031) 0:01:16.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.032) 0:01:16.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.807 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:16.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.029) 0:01:16.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.031) 0:01:16.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.082) 0:01:17.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.032) 0:01:17.045 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.033) 0:01:17.078 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:17.109 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.030) 0:01:17.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.032) 0:01:17.172 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:26:43 +0000 (0:00:00.035) 0:01:17.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.031) 0:01:17.238 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.033) 0:01:17.272 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.033) 0:01:17.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.030) 0:01:17.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.030) 0:01:17.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.031) 0:01:17.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.030) 0:01:17.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.028) 0:01:17.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.029) 0:01:17.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.030) 0:01:17.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.029) 0:01:17.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.034) 0:01:17.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.028) 0:01:17.611 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.027) 0:01:17.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
*********************************************************************
/cache/rhel-x.qcow2 : ok=473 changed=4 unreachable=0 failed=0 skipped=364 rescued=0 ignored=0

Wednesday 01 June 2022 16:26:44 +0000 (0:00:00.015) 0:01:17.654 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.75s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.11s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/tmp7247_7fr/tests/tests_change_fs_scsi_generated.yml:3 -------------------
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/tests_change_fs.yml:2 ----------------------------------
linux-system-roles.storage : get required packages ---------------------- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location =
/usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
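Annotation: the repeated `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"` entries in the repository-setup plays are Ansible's `no_log` redaction, which hides task results that may contain secrets (such as internal repo URLs or credentials). A minimal sketch of such a task follows; the module choice, variable names, and loop structure are assumptions for illustration, not the actual contents of `rhel-x_setup.yml`:

```yaml
# Hypothetical task illustrating no_log redaction; not the real
# contents of rhel-x_setup.yml.
- name: set up internal repositories
  yum_repository:
    name: "{{ item.name }}"           # assumed loop item field
    description: "{{ item.name }}"    # required by yum_repository
    baseurl: "{{ item.baseurl }}"     # may embed credentials, hence no_log
  loop: "{{ internal_repos }}"        # hypothetical variable
  no_log: true                        # replaces each item's result body
```

With `no_log: true`, ansible-playbook still reports the per-item `ok`/`changed` status, but substitutes the "censored" message for the result body, even at high verbosity.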
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:26:45 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:26:46 +0000 (0:00:01.277) 0:00:01.301 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_fs_use_partitions.yml ***********************************
1 plays in /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:2
Wednesday
01 June 2022 16:26:46 +0000 (0:00:00.015) 0:00:01.316 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:14 Wednesday 01 June 2022 16:26:47 +0000 (0:00:01.092) 0:00:02.409 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:26:47 +0000 (0:00:00.039) 0:00:02.448 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:26:47 +0000 (0:00:00.156) 0:00:02.605 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.534) 0:00:03.139 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.073) 0:00:03.213 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.023) 0:00:03.237 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.022) 0:00:03.259 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.197) 0:00:03.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:26:48 +0000 (0:00:00.021) 0:00:03.478 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:26:49 +0000 (0:00:01.088) 0:00:04.566 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:26:49 +0000 (0:00:00.047) 0:00:04.614 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:26:49 +0000 (0:00:00.043) 0:00:04.658 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:26:50 +0000 (0:00:00.688) 0:00:05.347 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:26:50 +0000 (0:00:00.082) 0:00:05.430 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:26:50 +0000 (0:00:00.020) 0:00:05.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
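The two debug tasks above report `storage_pools` and `storage_volumes` as "VARIABLE IS NOT DEFINED!" because this first pass invokes the role with no storage configuration (hence the empty `actions`/`pools`/`volumes` in the "get required packages" result). For context, a caller would normally supply those variables roughly as follows. This is a hedged sketch: the pool name, disk, volume name, size, and mount point are illustrative placeholders, not values taken from this run.

```yaml
# Hypothetical invocation of linux-system-roles.storage.
# Variable names storage_pools/storage_volumes match the debug
# output above; all concrete values below are made up for illustration.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo            # illustrative pool name
            disks: ["sda"]       # illustrative disk
            volumes:
              - name: test1      # illustrative volume name
                size: "3g"
                fs_type: xfs
                mount_point: /opt/test1
```

When neither variable is defined, as in this log, the role computes an empty action list and the run is effectively a no-op safety pass before the real test steps.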
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:26:50 +0000 (0:00:00.022) 0:00:05.473 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:26:50 +0000 (0:00:00.021) 0:00:05.495 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:26:51 +0000 (0:00:00.889) 0:00:06.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:26:53 +0000 (0:00:01.860) 0:00:08.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:26:53 +0000 
(0:00:00.043) 0:00:08.287 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:26:53 +0000 (0:00:00.066) 0:00:08.353 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.518) 0:00:08.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.030) 0:00:08.903 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.027) 0:00:08.930 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.032) 0:00:08.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.032) 0:00:08.995 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.031) 0:00:09.027 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.027) 0:00:09.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.027) 0:00:09.081 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.027) 0:00:09.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.028) 0:00:09.137 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:26:54 +0000 (0:00:00.467) 0:00:09.605 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:26:54 +0000 (0:00:00.028) 0:00:09.633 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:17 Wednesday 01 June 2022 16:26:55 +0000 (0:00:00.854) 0:00:10.488 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:24 Wednesday 01 June 2022 16:26:55 +0000 (0:00:00.030) 0:00:10.518 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:26:55 +0000 (0:00:00.044) 0:00:10.563 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:26:56 +0000 (0:00:00.524) 0:00:11.088 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:26:56 +0000 (0:00:00.032) 0:00:11.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] 
******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022  16:26:56 +0000 (0:00:00.029)       0:00:11.150 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] }

TASK [Create an LVM partition with the default file system type] ***************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:29
Wednesday 01 June 2022  16:26:56 +0000 (0:00:00.031)       0:00:11.182 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:26:56 +0000 (0:00:00.056)       0:00:11.238 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:26:56 +0000 (0:00:00.043)       0:00:11.281 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:26:56 +0000 (0:00:00.519)       0:00:11.801 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.068)       0:00:11.869 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.030)       0:00:11.899 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.028)       0:00:11.928 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.059)       0:00:11.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.054)       0:00:12.043 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.028)       0:00:12.071 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.035)       0:00:12.106 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.032)       0:00:12.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.030)       0:00:12.169 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.027)       0:00:12.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.029)       0:00:12.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.028)       0:00:12.255 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.042)       0:00:12.297 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:26:57 +0000 (0:00:00.026)       0:00:12.323 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/bar", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/bar-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022  16:27:00 +0000 (0:00:02.827)       0:00:15.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.029)       0:00:15.181 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.026)       0:00:15.207 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/bar", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/bar-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.039)       0:00:15.247 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null,
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.034)       0:00:15.282 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.032)       0:00:15.314 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022  16:27:00 +0000 (0:00:00.040)       0:00:15.355 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022  16:27:01 +0000 (0:00:00.941)       0:00:16.297 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022  16:27:02 +0000 (0:00:00.559)       0:00:16.857 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022  16:27:02 +0000 (0:00:00.684)       0:00:17.541 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022  16:27:03 +0000 (0:00:00.358)       0:00:17.900 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022  16:27:03 +0000 (0:00:00.029)       0:00:17.930 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:41
Wednesday 01 June 2022  16:27:03 +0000 (0:00:00.821)       0:00:18.751 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022  16:27:03 +0000 (0:00:00.055)       0:00:18.807 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022  16:27:04 +0000 (0:00:00.083)       0:00:18.891 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.]
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022  16:27:04 +0000 (0:00:00.030)       0:00:18.922 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "d4c3d3ef-e4e7-4588-ac03-77508ed22189" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "TbHukB-JlSV-frWJ-zpnu-23pw-9ozQ-uIrXsG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "PeZ5hq-ZFfQ-2Dz5-cKRW-QxSR-z9Tt-DDS2wr" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022  16:27:04 +0000 (0:00:00.472)       0:00:19.395 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002671", "end": "2022-06-01 12:27:04.463246", "rc": 0, "start": "2022-06-01 12:27:04.460575" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/bar-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.458)       0:00:19.853 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002774", "end": "2022-06-01 12:27:04.846861", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:27:04.844087" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.380)       0:00:20.234 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.062)       0:00:20.297 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK
[include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.029)       0:00:20.326 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.063)       0:00:20.389 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022  16:27:05 +0000 (0:00:00.038)       0:00:20.428 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.837)       0:00:21.266 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.050)       0:00:21.316 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.038)       0:00:21.355 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.036)       0:00:21.391 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.036)       0:00:21.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.029)       0:00:21.457 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }

MSG:

All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.060)       0:00:21.518 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.056)       0:00:21.574 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.030)       0:00:21.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.030)       0:00:21.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.032)       0:00:21.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.039)       0:00:21.708 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.033)       0:00:21.741 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.031)       0:00:21.773 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.033)       0:00:21.806 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022  16:27:06 +0000 (0:00:00.032)       0:00:21.839 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.061)       0:00:21.900 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.059)       0:00:21.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.030)       0:00:21.990 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact]
****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.070)       0:00:22.060 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.031)       0:00:22.091 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.061)       0:00:22.153 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.034)       0:00:22.188 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.039)       0:00:22.228 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.065)       0:00:22.294 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.034)       0:00:22.329 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.034)       0:00:22.363 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.030)       0:00:22.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.032)       0:00:22.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.030)       0:00:22.457 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.031)       0:00:22.488 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.035)       0:00:22.524 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.034)       0:00:22.559 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.031)       0:00:22.590 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.029)       0:00:22.619 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022  16:27:07 +0000 (0:00:00.029)       0:00:22.649 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed":
false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:27:07 +0000 (0:00:00.030) 0:00:22.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:27:07 +0000 (0:00:00.033) 0:00:22.713 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:27:07 +0000 (0:00:00.069) 0:00:22.782 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.065) 0:00:22.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.030) 0:00:22.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.031) 0:00:22.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.030) 0:00:22.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.030) 0:00:22.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.030) 0:00:23.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.029) 0:00:23.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.029) 0:00:23.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.040) 0:00:23.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.033) 0:00:23.132 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.062) 0:00:23.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.035) 0:00:23.229 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.126) 0:00:23.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" 
}, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.035) 0:00:23.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/bar-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "d4c3d3ef-e4e7-4588-ac03-77508ed22189" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/bar-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "d4c3d3ef-e4e7-4588-ac03-77508ed22189" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.042) 0:00:23.434 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.036) 0:00:23.471 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.034) 0:00:23.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.037) 0:00:23.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.029) 0:00:23.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.029) 0:00:23.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.029) 0:00:23.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.030) 0:00:23.663 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.048) 0:00:23.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.037) 0:00:23.749 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.036) 0:00:23.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:27:08 +0000 (0:00:00.031) 0:00:23.817 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.031) 0:00:23.848 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.038) 0:00:23.887 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.040) 0:00:23.928 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100819.6871216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100819.6871216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1767, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100819.6871216, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.384) 0:00:24.312 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.039) 0:00:24.351 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.036) 0:00:24.388 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.036) 0:00:24.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.029) 0:00:24.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.034) 0:00:24.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.029) 0:00:24.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:27:09 +0000 
(0:00:00.031) 0:00:24.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.030) 0:00:24.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.037) 0:00:24.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.030) 0:00:24.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.030) 0:00:24.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.030) 0:00:24.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.031) 0:00:24.739 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.030) 0:00:24.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:27:09 +0000 (0:00:00.039) 0:00:24.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.037) 0:00:24.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.031) 0:00:24.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.030) 0:00:24.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:27:10 +0000 
(0:00:00.032) 0:00:24.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.031) 0:00:24.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.030) 0:00:25.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.031) 0:00:25.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.030) 0:00:25.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.030) 0:00:25.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 
01 June 2022 16:27:10 +0000 (0:00:00.035) 0:00:25.130 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.031) 0:00:25.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.030) 0:00:25.192 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:27:10 +0000 (0:00:00.523) 0:00:25.716 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.392) 0:00:26.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.039) 0:00:26.148 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.034) 0:00:26.182 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.039) 0:00:26.221 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.038) 0:00:26.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.030) 0:00:26.291 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.032) 0:00:26.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.033) 0:00:26.357 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.036) 0:00:26.393 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.034) 0:00:26.427 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:27:11 +0000 (0:00:00.040) 0:00:26.468 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.035105", "end": "2022-06-01 12:27:11.487838", "rc": 0, "start": "2022-06-01 12:27:11.452733" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.411) 0:00:26.879 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.036) 0:00:26.916 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.035) 0:00:26.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.030) 0:00:26.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.033) 0:00:27.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.077) 0:00:27.093 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.033) 0:00:27.126 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.029) 0:00:27.155 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.027) 0:00:27.183 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.028) 0:00:27.211 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change the LVM partition file system type to "ext4"] *********************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:43
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.033) 0:00:27.244 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.064) 0:00:27.309 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:27:12 +0000 (0:00:00.043) 0:00:27.353 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.535) 0:00:27.889 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.077) 0:00:27.966 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.030) 0:00:27.996 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.028) 0:00:28.024 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.060) 0:00:28.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.025) 0:00:28.111 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.032) 0:00:28.143 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.045) 0:00:28.188 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.035) 0:00:28.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.033) 0:00:28.257 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.030) 0:00:28.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.031) 0:00:28.319 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.028) 0:00:28.348 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.041) 0:00:28.390 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:27:13 +0000 (0:00:00.026) 0:00:28.417 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:27:15 +0000 (0:00:02.179) 0:00:30.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:27:15 +0000 (0:00:00.032) 0:00:30.628 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:27:15 +0000 (0:00:00.027) 0:00:30.656 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:27:15 +0000 (0:00:00.041) 0:00:30.697 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:27:15 +0000 (0:00:00.037) 0:00:30.734 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:27:15 +0000 (0:00:00.035) 0:00:30.770 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:27:16 +0000 (0:00:00.383) 0:00:31.154 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:27:16 +0000 (0:00:00.676) 0:00:31.830 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:27:17 +0000 (0:00:00.395) 0:00:32.225 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:27:18 +0000 (0:00:00.668) 0:00:32.894 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:27:18 +0000 (0:00:00.358) 0:00:33.253 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:27:18 +0000 (0:00:00.030) 0:00:33.283 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:56
Wednesday 01 June 2022 16:27:19 +0000 (0:00:00.829) 0:00:34.113 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:27:19 +0000 (0:00:00.055) 0:00:34.169 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:27:19 +0000 (0:00:00.051) 0:00:34.220 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:27:19 +0000 (0:00:00.029) 0:00:34.250 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "TbHukB-JlSV-frWJ-zpnu-23pw-9ozQ-uIrXsG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "PeZ5hq-ZFfQ-2Dz5-cKRW-QxSR-z9Tt-DDS2wr" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:27:19 +0000 (0:00:00.378) 0:00:34.629 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003013", "end": "2022-06-01 12:27:19.610502", "rc": 0, "start": "2022-06-01 12:27:19.607489" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/bar-test1 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.371) 0:00:35.000 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003093", "end": "2022-06-01 12:27:19.978762", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:27:19.975669" }
STDERR: cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.382) 0:00:35.383 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
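The crypttab read just above exits with `rc: 1` because `/etc/crypttab` does not exist, yet the task succeeds: the test tolerates the non-zero return code (the result shows `"failed_when_result": false`) and simply treats the missing file as an empty crypttab. A minimal shell sketch of that same tolerance (the `read_crypttab` helper is hypothetical, not the role's actual code):

```shell
#!/bin/sh
# Read a crypttab-style file, but treat a missing file as an empty
# crypttab rather than an error -- mirroring how the test above
# suppresses the failure of `cat /etc/crypttab` via failed_when.
read_crypttab() {
    cat "$1" 2>/dev/null   # "No such file or directory" is discarded
    return 0               # missing file -> empty output, zero exit status
}

entries=$(read_crypttab /nonexistent/crypttab)
echo "entries=[$entries]"   # prints "entries=[]" when the file is absent
```

The point is that "no crypttab" and "empty crypttab" are equivalent for verification purposes, so the caller only ever has to compare against an expected entry count.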
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.101) 0:00:35.484 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.060) 0:00:35.545 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.113) 0:00:35.658 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:27:20 +0000 (0:00:00.068) 0:00:35.727 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.786) 0:00:36.514 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.048) 0:00:36.562 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.038) 0:00:36.601 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.038) 0:00:36.639 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.034) 0:00:36.674 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.029) 0:00:36.704 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }
MSG: All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.057) 0:00:36.761 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:27:21 +0000 (0:00:00.055) 0:00:36.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:36.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:36.878 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.028) 0:00:36.907 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.032) 0:00:36.939 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.029) 0:00:36.968 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.028) 0:00:36.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.028) 0:00:37.025 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.034) 0:00:37.059 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.058) 0:00:37.117 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.060) 0:00:37.178 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.031) 0:00:37.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.029) 0:00:37.240 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:37.270 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.061) 0:00:37.331 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.041) 0:00:37.373 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.047) 0:00:37.421 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.085) 0:00:37.506 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.037) 0:00:37.543 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.039) 0:00:37.583 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:37.613 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.029) 0:00:37.642 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:37.673 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.033) 0:00:37.706 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.041) 0:00:37.747 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.039) 0:00:37.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:27:22 +0000 (0:00:00.030) 0:00:37.817 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.030) 0:00:37.848 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false,
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.029) 0:00:37.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.029) 0:00:37.907 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.033) 0:00:37.941 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.066) 0:00:38.007 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.064) 0:00:38.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.031) 0:00:38.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.030) 0:00:38.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.030) 0:00:38.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.029) 0:00:38.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.029) 0:00:38.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.031) 0:00:38.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.032) 0:00:38.287 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.030) 0:00:38.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.031) 0:00:38.348 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.057) 0:00:38.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.036) 0:00:38.443 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.158) 0:00:38.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.036) 0:00:38.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 
June 2022 16:27:23 +0000 (0:00:00.042) 0:00:38.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.037) 0:00:38.717 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.034) 0:00:38.752 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.039) 0:00:38.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:27:23 +0000 (0:00:00.031) 0:00:38.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.031) 0:00:38.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.029) 0:00:38.884 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.030) 0:00:38.915 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.045) 0:00:38.961 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.035) 0:00:38.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.034) 0:00:39.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 
16:27:24 +0000 (0:00:00.029) 0:00:39.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.030) 0:00:39.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.036) 0:00:39.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.036) 0:00:39.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100835.1411216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100835.1411216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1767, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100835.1411216, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, 
"xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.376) 0:00:39.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.037) 0:00:39.577 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.034) 0:00:39.612 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.035) 0:00:39.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.031) 0:00:39.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.037) 0:00:39.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.033) 0:00:39.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.033) 0:00:39.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:27:24 +0000 (0:00:00.030) 0:00:39.815 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.039) 0:00:39.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:39.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:39.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:39.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.033) 0:00:39.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.046) 0:00:40.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.051) 0:00:40.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.032) 0:00:40.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] 
**************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.033) 0:00:40.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.030) 0:00:40.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.030) 0:00:40.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.034) 0:00:40.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.032) 0:00:40.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.031) 0:00:40.462 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:27:25 +0000 (0:00:00.370) 0:00:40.832 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.370) 0:00:41.203 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.038) 0:00:41.241 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.033) 0:00:41.274 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.029) 0:00:41.304 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.029) 0:00:41.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.032) 0:00:41.367 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.031) 0:00:41.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.032) 0:00:41.431 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, 
"changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.044) 0:00:41.475 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.035) 0:00:41.511 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:27:26 +0000 (0:00:00.039) 0:00:41.551 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.035026", "end": "2022-06-01 12:27:26.577600", "rc": 0, "start": "2022-06-01 12:27:26.542574" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.420) 0:00:41.971 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.039) 0:00:42.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.038) 0:00:42.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.032) 0:00:42.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.031) 0:00:42.114 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.031) 0:00:42.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.030) 0:00:42.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.031) 0:00:42.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.029) 0:00:42.237 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.026) 0:00:42.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:58 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.029) 0:00:42.294 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.067) 0:00:42.361 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:27:27 +0000 (0:00:00.043) 0:00:42.404 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.533) 0:00:42.938 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => 
(item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.072) 0:00:43.011 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.031) 0:00:43.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.030) 0:00:43.073 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.064) 0:00:43.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.027) 0:00:43.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.030) 0:00:43.195 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.039) 0:00:43.234 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.033) 0:00:43.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.031) 0:00:43.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.032) 0:00:43.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.029) 0:00:43.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.029) 0:00:43.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.044) 0:00:43.436 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:27:28 +0000 (0:00:00.027) 0:00:43.463 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:27:30 +0000 (0:00:01.839) 0:00:45.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.032) 0:00:45.335 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.029) 0:00:45.365 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": 
[] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.039) 0:00:45.404 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.038) 0:00:45.443 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.035) 0:00:45.479 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:27:30 +0000 (0:00:00.030) 0:00:45.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:27:31 +0000 (0:00:00.673) 0:00:46.183 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:27:31 +0000 (0:00:00.390) 0:00:46.574 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:27:32 +0000 (0:00:00.635) 0:00:47.209 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:27:32 +0000 (0:00:00.382) 0:00:47.592 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:27:32 +0000 (0:00:00.029) 0:00:47.622 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:71 Wednesday 01 June 2022 16:27:33 +0000 (0:00:00.842) 0:00:48.464 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:27:33 +0000 (0:00:00.057) 0:00:48.522 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:27:33 +0000 (0:00:00.040) 0:00:48.563 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:27:33 +0000 (0:00:00.030) 0:00:48.593 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "TbHukB-JlSV-frWJ-zpnu-23pw-9ozQ-uIrXsG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "PeZ5hq-ZFfQ-2Dz5-cKRW-QxSR-z9Tt-DDS2wr" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": 
"2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:27:34 +0000 (0:00:00.377) 0:00:48.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002596", "end": "2022-06-01 12:27:33.953406", "rc": 0, "start": "2022-06-01 12:27:33.950810" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/bar-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:27:34 +0000 (0:00:00.379) 0:00:49.350 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": 
"0:00:00.002765", "end": "2022-06-01 12:27:34.327327", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:27:34.324562" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:27:34 +0000 (0:00:00.387) 0:00:49.738 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:27:34 +0000 (0:00:00.064) 0:00:49.802 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:27:34 +0000 (0:00:00.029) 0:00:49.832 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.063) 0:00:49.895 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.040) 0:00:49.936 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.755) 0:00:50.692 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.050) 0:00:50.743 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.038) 0:00:50.781 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:27:35 +0000 (0:00:00.036) 0:00:50.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.035) 0:00:50.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:50.885 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.050) 0:00:50.935 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.055) 0:00:50.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:27:36 +0000 
(0:00:00.029) 0:00:51.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.032) 0:00:51.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.029) 0:00:51.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.032) 
0:00:51.241 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.057) 0:00:51.299 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.063) 0:00:51.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.031) 0:00:51.457 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.065) 0:00:51.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.039) 0:00:51.562 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.040) 0:00:51.602 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.068) 0:00:51.671 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.036) 0:00:51.708 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 
16:27:36 +0000 (0:00:00.038) 0:00:51.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.030) 0:00:51.777 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.029) 0:00:51.806 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:27:36 +0000 (0:00:00.030) 0:00:51.837 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.031) 0:00:51.868 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.035) 0:00:51.903 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.036) 0:00:51.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.030) 0:00:51.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.030) 0:00:52.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.031 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.060 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.090 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.099) 0:00:52.190 ********
included:
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.063) 0:00:52.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.030) 0:00:52.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.314 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.344 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.373 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.032) 0:00:52.466 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.032) 0:00:52.498 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.029) 0:00:52.528 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.056) 0:00:52.584 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.036) 0:00:52.621 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.132) 0:00:52.753 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.037) 0:00:52.791 ********
ok: [/cache/rhel-x.qcow2] => {
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "0866635f-8a76-40a9-b69f-baed01c2ea8f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:27:37 +0000 (0:00:00.046) 0:00:52.837 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.037) 0:00:52.875 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.034) 0:00:52.909 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.037) 0:00:52.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.030) 0:00:52.977 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.029) 0:00:53.007 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.029) 0:00:53.036 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.030) 0:00:53.067 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ],
"storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.047) 0:00:53.114 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.033) 0:00:53.148 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.035) 0:00:53.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.028) 0:00:53.212 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.029) 0:00:53.242 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.035) 0:00:53.277 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.037) 0:00:53.315 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100835.1411216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100835.1411216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1767, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100835.1411216, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.379) 0:00:53.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.039) 0:00:53.735 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June
2022 16:27:38 +0000 (0:00:00.039) 0:00:53.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:27:38 +0000 (0:00:00.035) 0:00:53.810 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:53.840 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.033) 0:00:53.874 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:53.905 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.028) 0:00:53.934 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.027) 0:00:53.961 ********
ok:
[/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.034) 0:00:53.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.027) 0:00:54.024 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.031) 0:00:54.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.032) 0:00:54.087 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:54.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:54.149 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.037) 0:00:54.187 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.035) 0:00:54.222 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.029) 0:00:54.252 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.032) 0:00:54.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.065) 0:00:54.350 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.031) 0:00:54.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:54.411 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:54.441 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.030) 0:00:54.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.029) 0:00:54.502 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.031) 0:00:54.534 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022
16:27:39 +0000 (0:00:00.029) 0:00:54.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:27:39 +0000 (0:00:00.028) 0:00:54.592 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.390) 0:00:54.982 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.375) 0:00:55.358 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.036) 0:00:55.394 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.035) 0:00:55.430 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.031) 0:00:55.462 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of
parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.031) 0:00:55.494 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.029) 0:00:55.524 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.031) 0:00:55.556 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.029) 0:00:55.585 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.031) 0:00:55.617 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.033) 0:00:55.651 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:27:40 +0000 (0:00:00.041) 0:00:55.692 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.041686", "end": "2022-06-01 12:27:40.727700", "rc": 0, "start": "2022-06-01 12:27:40.686014" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.427) 0:00:56.119 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.038) 0:00:56.158 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.038) 0:00:56.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.032) 0:00:56.229 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday
01 June 2022 16:27:41 +0000 (0:00:00.031) 0:00:56.261 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.031) 0:00:56.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.034) 0:00:56.327 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.030) 0:00:56.358 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.031) 0:00:56.389 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.027) 0:00:56.416 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:73
Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.029) 0:00:56.446 ********

TASK
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.082) 0:00:56.529 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:27:41 +0000 (0:00:00.044) 0:00:56.573 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.527) 0:00:57.101 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
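The "Set platform/version specific variables" task above loops over candidate vars files from most generic to most specific (`RedHat.yml`, `RedHat_9.yml`, `RedHat_9.1.yml`), skipping those whose condition is false and including the one that matches (here `vars/RedHat_9.yml`, which supplies `blivet_package_list`). A rough sketch of that pattern; the loop expressions and file-existence condition are assumptions, only the file names come from the log:

```yaml
# Hedged sketch of per-platform vars loading; candidate names mirror
# the log items, the loop condition is an assumption.
- name: Set platform/version specific variables
  include_vars: "{{ role_path }}/vars/{{ item }}"
  loop:
    - "{{ ansible_facts['os_family'] }}.yml"            # e.g. RedHat.yml
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # RedHat_9.yml
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # RedHat_9.1.yml
  when: (role_path ~ '/vars/' ~ item) is file  # assumption; real role may test differently
```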
Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.071) 0:00:57.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.032) 0:00:57.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.030) 0:00:57.236 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.106) 0:00:57.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.032) 0:00:57.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.033) 0:00:57.409 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "state": "absent", "volumes": [ { "fs_type": "ext4", "mount_point": 
"/opt/test1", "name": "test1", "size": "5g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.040) 0:00:57.449 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.033) 0:00:57.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.034) 0:00:57.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.030) 0:00:57.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.030) 0:00:57.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
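The JSON printed by "show storage_pools" above corresponds to the following playbook variable, reconstructed directly from the logged values (this is the test's input asking the role to remove the pool and its volume):

```yaml
# Reconstructed from the "show storage_pools" debug output above.
storage_pools:
  - name: bar
    disks: [sda, sdb]
    state: absent
    volumes:
      - name: test1
        fs_type: ext4
        mount_point: /opt/test1
        size: 5g
        state: absent
```

Note that `storage_volumes` is deliberately left undefined in this test, which is why the next debug task prints "VARIABLE IS NOT DEFINED!".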
Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.030) 0:00:57.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.044) 0:00:57.651 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:27:42 +0000 (0:00:00.029) 0:00:57.681 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/bar", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:27:45 +0000 (0:00:02.835) 0:01:00.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:27:45 +0000 (0:00:00.031) 0:01:00.547 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:27:45 +0000 (0:00:00.028) 0:01:00.575 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy 
format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/bar", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:27:45 +0000 (0:00:00.044) 0:01:00.620 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the 
list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:27:45 +0000 (0:00:00.036) 0:01:00.656 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:27:45 +0000 (0:00:00.035) 0:01:00.691 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:27:46 +0000 (0:00:00.383) 0:01:01.075 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:27:46 +0000 (0:00:00.650) 0:01:01.726 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:27:46 +0000 (0:00:00.030) 0:01:01.756 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** 
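The "remove obsolete mounts" result above shows the role looping over `mounts` from the blivet output with `ansible_loop_var: mount_info` and passing each entry to the mount module. A sketch of what that task plausibly looks like; the field names are taken from the registered result, but the exact task body is an assumption:

```yaml
# Hedged sketch of the obsolete-mount removal; fields mirror the
# registered result above, the task body itself is an assumption.
- name: remove obsolete mounts
  mount:
    src: "{{ mount_info['src'] }}"
    path: "{{ mount_info['path'] }}"
    fstype: "{{ mount_info['fstype'] }}"
    state: "{{ mount_info['state'] }}"
  loop: "{{ blivet_output.mounts }}"
  loop_control:
    loop_var: mount_info
```

With `state: absent`, the module removes the `/opt/test1` entry from `/etc/fstab` and unmounts it, which is why the next task tells systemd to refresh its view of `/etc/fstab`.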
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:27:47 +0000 (0:00:00.660) 0:01:02.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:27:47 +0000 (0:00:00.378) 0:01:02.796 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:27:47 +0000 (0:00:00.028) 0:01:02.824 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:88 Wednesday 01 June 2022 16:27:48 +0000 (0:00:00.850) 0:01:03.675 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:27:48 +0000 (0:00:00.063) 0:01:03.738 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" 
], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:27:48 +0000 (0:00:00.039) 0:01:03.777 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:27:48 +0000 (0:00:00.029) 0:01:03.807 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", 
"label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:27:49 +0000 (0:00:00.385) 0:01:04.192 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002800", "end": "2022-06-01 12:27:49.183750", "rc": 0, "start": "2022-06-01 12:27:49.180950" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:27:49 +0000 (0:00:00.380) 0:01:04.573 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002881", "end": "2022-06-01 12:27:49.571418", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:27:49.568537" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.388) 0:01:04.962 ******** [WARNING]: The loop variable 
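The crypttab read above exits with `rc: 1` (`cat: /etc/crypttab: No such file or directory`) yet the task is reported ok, because `failed_when_result` is false: the test deliberately tolerates a missing file. A minimal sketch of that pattern (the register name is an assumption):

```yaml
# Hedged sketch: read a file that may not exist without failing the play.
- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: storage_test_crypttab  # register name is an assumption
  failed_when: false
  changed_when: false
```

Downstream assertions can then branch on `storage_test_crypttab.rc` instead of aborting when the file is absent.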
'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.064) 0:01:05.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.030) 0:01:05.057 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.060) 0:01:05.118 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.042) 0:01:05.160 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.034) 0:01:05.195 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:27:50 
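The WARNING above ("The loop variable 'storage_test_pool' is already in use") is Ansible flagging that the default or chosen loop variable shadows a variable already set in the surrounding scope. The fix it suggests is to rename the inner loop variable via `loop_control`; a minimal sketch, with a hypothetical replacement name:

```yaml
# Hedged sketch of the suggested fix; the replacement variable name
# is hypothetical -- any name not already in scope works.
- include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_inner
```

The same warning fires again later for `storage_test_volume`, and the same `loop_control.loop_var` rename applies there.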
+0000 (0:00:00.038) 0:01:05.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.039) 0:01:05.273 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.037) 0:01:05.310 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.037) 0:01:05.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.031) 0:01:05.379 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.028) 0:01:05.408 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.061) 0:01:05.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.036) 0:01:05.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.031) 0:01:05.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.032) 0:01:05.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.033) 0:01:05.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.031) 0:01:05.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.031) 0:01:05.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 
Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.030) 0:01:05.697 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.032) 0:01:05.730 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:27:50 +0000 (0:00:00.059) 0:01:05.789 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.058) 0:01:05.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.029) 0:01:05.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.027) 0:01:05.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.030) 0:01:05.935 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.062) 0:01:05.997 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.034) 0:01:06.032 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.028) 0:01:06.060 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.026) 0:01:06.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.031) 0:01:06.118 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.061) 0:01:06.179 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.061) 0:01:06.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.031) 0:01:06.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.030) 0:01:06.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.029) 0:01:06.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.031) 0:01:06.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.030) 0:01:06.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.030) 0:01:06.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.076) 0:01:06.501 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.038) 0:01:06.540 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.044) 0:01:06.585 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
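The WARNING above is Ansible's standard loop-variable collision notice: a looping `include_tasks` is being run while an outer loop already owns the `storage_test_volume` loop variable. A minimal sketch of the fix the warning itself suggests — set `loop_var` under `loop_control` — follows; the task name and list variable are illustrative placeholders, not taken from the actual test files:

```yaml
# Hypothetical sketch: give the inner loop its own variable via
# loop_control so it no longer shadows the outer loop's
# storage_test_volume and the warning goes away.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ _volumes_to_check }}"        # placeholder list variable
  loop_control:
    loop_var: storage_test_volume_inner  # any name not already in use
```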
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.062) 0:01:06.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.036) 0:01:06.683 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.118) 0:01:06.802 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:27:51 +0000 (0:00:00.036) 0:01:06.839 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.040) 0:01:06.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.031) 0:01:06.910 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.035) 0:01:06.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.030) 0:01:06.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.030) 0:01:07.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.030) 
0:01:07.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.028) 0:01:07.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.030) 0:01:07.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.045) 0:01:07.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.024) 0:01:07.167 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.034) 0:01:07.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.029) 0:01:07.232 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.028) 0:01:07.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.027) 0:01:07.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.024) 0:01:07.312 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.393) 0:01:07.706 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.043) 0:01:07.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.026) 0:01:07.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:27:52 +0000 (0:00:00.033) 0:01:07.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.033) 0:01:07.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.027) 0:01:07.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:07.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:07.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.031) 0:01:07.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.028) 0:01:07.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.032) 0:01:08.024 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.027) 0:01:08.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.029) 0:01:08.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.027) 0:01:08.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.027) 0:01:08.137 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.034) 0:01:08.171 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.036) 0:01:08.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.031) 0:01:08.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.031) 0:01:08.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.029) 0:01:08.301 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.032) 0:01:08.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.029) 0:01:08.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.029) 0:01:08.453 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.029) 0:01:08.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.032) 0:01:08.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.030) 0:01:08.638 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.035) 0:01:08.673 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.033) 0:01:08.707 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.031) 0:01:08.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:27:53 +0000 (0:00:00.077) 0:01:08.815 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.032) 0:01:08.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:08.877 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.032) 0:01:08.910 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.030) 0:01:08.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:08.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.032) 0:01:09.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:09.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:09.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:09.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.029) 0:01:09.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.031) 0:01:09.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.033) 0:01:09.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.030) 0:01:09.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.030) 0:01:09.246 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:27:54 +0000 (0:00:00.027) 0:01:09.273 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
*********************************************************************
/cache/rhel-x.qcow2 : ok=390  changed=7  unreachable=0  failed=0  skipped=305  rescued=0  ignored=0

Wednesday 01 June 2022  16:27:54 +0000 (0:00:00.015)       0:01:09.289 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.18s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:2 -------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : make sure required packages are installed --- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Get the canonical device path for each member device -------------------- 0.84s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Get the canonical device path for each member device -------------------- 0.79s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
Get the canonical device path for each member device -------------------- 0.76s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat
4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:27:55 +0000 (0:00:00.022)       0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:27:56 +0000 (0:00:01.273)       0:00:01.296 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_fs_use_partitions_nvme_generated.yml ********************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:27:56 +0000 (0:00:00.017)       0:00:01.313 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:27:57 +0000 (0:00:00.022)       0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:27:58 +0000 (0:00:01.267)       0:00:01.290 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_fs_use_partitions_scsi_generated.yml ********************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_scsi_generated.yml:3
Wednesday 01 June 2022  16:27:58 +0000 (0:00:00.016)       0:00:01.306 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_scsi_generated.yml:7
Wednesday 01 June 2022  16:27:59 +0000 (0:00:01.067)       0:00:02.374 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:2
Wednesday 01 June 2022  16:27:59 +0000 (0:00:00.025)       0:00:02.400 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:14
Wednesday 01 June 2022  16:28:00 +0000 (0:00:00.791)       0:00:03.191 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:28:00 +0000 (0:00:00.039)       0:00:03.230 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:28:00 +0000 (0:00:00.158)       0:00:03.389 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.514)       0:00:03.903 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.081)       0:00:03.985 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.023)       0:00:04.009 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.021)       0:00:04.031 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.192)       0:00:04.223 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:28:01 +0000 (0:00:00.018)       0:00:04.242 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:28:02 +0000 (0:00:01.052)       0:00:05.295 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:28:02 +0000 (0:00:00.054)       0:00:05.350 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:28:02 +0000 (0:00:00.047)       0:00:05.397 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:28:03 +0000 (0:00:00.681)       0:00:06.079 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  16:28:03 +0000 (0:00:00.082)       0:00:06.161 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  16:28:03 +0000 (0:00:00.021)       0:00:06.183 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  16:28:03 +0000 (0:00:00.022)       0:00:06.205 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:28:03 +0000 (0:00:00.019)       0:00:06.224 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:28:04 +0000 (0:00:00.808)       0:00:07.033 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
            "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
            "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:28:06 +0000 (0:00:01.832)       0:00:08.865 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:28:06 +0000 (0:00:00.043)       0:00:08.908 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:28:06 +0000 (0:00:00.026)       0:00:08.935 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022  16:28:06 +0000 (0:00:00.532)       0:00:09.467 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:28:06 +0000 (0:00:00.030)       0:00:09.497
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.028) 0:00:09.525 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.034) 0:00:09.560 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.031) 0:00:09.591 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.032) 0:00:09.624 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.025) 0:00:09.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.030) 0:00:09.680 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.027) 0:00:09.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:28:06 +0000 (0:00:00.027) 0:00:09.735 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:28:07 +0000 (0:00:00.459) 0:00:10.194 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:28:07 +0000 (0:00:00.028) 0:00:10.223 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:17 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.821) 0:00:11.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:24 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.030) 0:00:11.076 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.044) 0:00:11.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.514) 0:00:11.634 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.036) 0:00:11.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.029) 0:00:11.700 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create an LVM partition with the default file system type] *************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:29 Wednesday 01 June 2022 16:28:08 +0000 (0:00:00.032) 0:00:11.733 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.053) 0:00:11.787 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.042) 0:00:11.829 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version 
specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.497) 0:00:12.327 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.067) 0:00:12.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.029) 0:00:12.424 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.029) 0:00:12.454 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.090) 0:00:12.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.025) 0:00:12.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.031) 0:00:12.602 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.035) 0:00:12.637 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.031) 0:00:12.669 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.031) 0:00:12.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.029) 0:00:12.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:28:09 +0000 (0:00:00.029) 0:00:12.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:28:10 +0000 (0:00:00.029) 0:00:12.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:28:10 +0000 (0:00:00.043) 0:00:12.831 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:28:10 +0000 (0:00:00.028) 
0:00:12.859 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/bar", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/bar-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:28:12 +0000 (0:00:02.854) 0:00:15.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:28:12 +0000 (0:00:00.029) 0:00:15.743 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:28:12 +0000 (0:00:00.030) 0:00:15.774 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/bar", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/bar-test1", 
"fs_type": null }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/bar-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:28:13 +0000 (0:00:00.049) 0:00:15.824 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:28:13 +0000 (0:00:00.037) 0:00:15.861 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:28:13 +0000 (0:00:00.034) 0:00:15.896 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:28:13 +0000 (0:00:00.030) 0:00:15.926 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:28:14 +0000 (0:00:00.956) 0:00:16.882 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:28:14 +0000 (0:00:00.541) 0:00:17.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:28:15 +0000 (0:00:00.631) 0:00:18.055 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : 
manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:28:15 +0000 (0:00:00.373) 0:00:18.429 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:28:15 +0000 (0:00:00.029) 0:00:18.458 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:41 Wednesday 01 June 2022 16:28:16 +0000 (0:00:00.828) 0:00:19.287 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:28:16 +0000 (0:00:00.055) 0:00:19.343 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:28:16 +0000 (0:00:00.039) 0:00:19.383 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:28:16 +0000 (0:00:00.028) 0:00:19.411 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "64d2baea-38a7-4612-bf7b-43be91e31063" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "kjnjdr-CWnb-0tqR-cxM0-0Tf8-Yjf0-01x382" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "E7MEv2-NU0h-vRZx-vEDX-vJOz-EM2c-mt54nF" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:28:17 +0000 (0:00:00.514) 0:00:19.925 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002462", "end": "2022-06-01 12:28:17.045934", "rc": 0, "start": "2022-06-01 12:28:17.043472" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/bar-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:28:17 +0000 (0:00:00.455) 0:00:20.381 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003185", "end": "2022-06-01 12:28:17.431616", "failed_when_result": false, "rc": 1, "start": "2022-06-01 
12:28:17.428431" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:28:17 +0000 (0:00:00.389) 0:00:20.771 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:28:18 +0000 (0:00:00.063) 0:00:20.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:28:18 +0000 (0:00:00.031) 0:00:20.866 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:28:18 +0000 (0:00:00.065) 0:00:20.931 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:28:18 +0000 (0:00:00.038) 0:00:20.969 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": 
"/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:28:18 +0000 (0:00:00.781) 0:00:21.751 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.047) 0:00:21.799 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.036) 0:00:21.835 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.034) 0:00:21.869 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.033) 0:00:21.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.033) 0:00:21.936 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }

MSG:

All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.050) 0:00:21.987 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.057) 0:00:22.045 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.032) 0:00:22.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.030) 0:00:22.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.029) 0:00:22.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.029) 0:00:22.167 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.029) 0:00:22.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.029) 0:00:22.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.031) 0:00:22.258 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.030) 0:00:22.288 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.059) 0:00:22.348 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.061) 0:00:22.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.030) 0:00:22.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.030) 0:00:22.470 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.029) 0:00:22.500 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.061) 0:00:22.562 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.068) 0:00:22.631 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.039) 0:00:22.670 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.067) 0:00:22.737 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:28:19 +0000 (0:00:00.036) 0:00:22.774 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.037) 0:00:22.811 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:22.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.029) 0:00:22.872 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.030) 0:00:22.903 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:22.934 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.037) 0:00:22.971 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.034) 0:00:23.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.030) 0:00:23.036 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.029) 0:00:23.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.029) 0:00:23.096 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:23.127 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.033) 0:00:23.160 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.063) 0:00:23.223 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.065) 0:00:23.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.033) 0:00:23.323 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:23.354 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.032) 0:00:23.386 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.030) 0:00:23.416 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.029) 0:00:23.446 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.032) 0:00:23.479 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.030) 0:00:23.509 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:23.541 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.031) 0:00:23.572 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.060) 0:00:23.632 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:28:20 +0000 (0:00:00.034) 0:00:23.667 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.127) 0:00:23.795 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.036) 0:00:23.831 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/bar-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "64d2baea-38a7-4612-bf7b-43be91e31063" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/bar-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "64d2baea-38a7-4612-bf7b-43be91e31063" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.041) 0:00:23.872 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.041) 0:00:23.913 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.034) 0:00:23.948 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.038) 0:00:23.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.033) 0:00:24.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.030) 0:00:24.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.030) 0:00:24.080 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.033) 0:00:24.114 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.050) 0:00:24.165 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.035) 0:00:24.201 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.037) 0:00:24.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.031) 0:00:24.270 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.031) 0:00:24.302 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.040) 0:00:24.343 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:28:21 +0000 (0:00:00.037) 0:00:24.380 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100892.3051214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100892.3051214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2257, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100892.3051214, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.443) 0:00:24.824 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.038) 0:00:24.862 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.034) 0:00:24.896 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.032) 0:00:24.928 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.031) 0:00:24.959 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.037) 0:00:24.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.029) 0:00:25.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.087 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.037) 0:00:25.124 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.032) 0:00:25.188 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.218 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.279 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.039) 0:00:25.319 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.039) 0:00:25.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.035) 0:00:25.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.038) 0:00:25.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.033) 0:00:25.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.033) 0:00:25.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.031) 0:00:25.530 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.030) 0:00:25.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.035) 0:00:25.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.032) 0:00:25.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.034) 0:00:25.663 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.036) 0:00:25.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:28:22 +0000 (0:00:00.032) 0:00:25.732 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.470) 0:00:26.203 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.393) 0:00:26.597 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.039) 0:00:26.636 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.034) 0:00:26.671 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.031) 0:00:26.702 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.032) 0:00:26.735 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:28:23 +0000 (0:00:00.032) 0:00:26.767 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.032) 0:00:26.800 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:28:24
+0000 (0:00:00.033) 0:00:26.833 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.035) 0:00:26.868 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.033) 0:00:26.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.039) 0:00:26.941 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.033684", "end": "2022-06-01 12:28:24.017676", "rc": 0, "start": "2022-06-01 12:28:23.983992" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.419) 0:00:27.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.039) 
0:00:27.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.039) 0:00:27.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.036) 0:00:27.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.033) 0:00:27.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.031) 0:00:27.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.030) 0:00:27.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.029) 0:00:27.602 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": 
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.029) 0:00:27.631 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.061) 0:00:27.693 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the LVM partition file system type to "ext4"] ********************* task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:43 Wednesday 01 June 2022 16:28:24 +0000 (0:00:00.034) 0:00:27.728 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.065) 0:00:27.793 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.049) 0:00:27.842 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.527) 0:00:28.370 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": 
false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.072) 0:00:28.443 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.031) 0:00:28.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.031) 0:00:28.507 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.061) 0:00:28.568 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.025) 0:00:28.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.030) 0:00:28.624 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.043) 0:00:28.668 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.033) 0:00:28.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.030) 0:00:28.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:28:25 +0000 (0:00:00.029) 0:00:28.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:28:26 +0000 (0:00:00.028) 0:00:28.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:28:26 +0000 (0:00:00.028) 0:00:28.820 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:28:26 +0000 (0:00:00.045) 0:00:28.866 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:28:26 +0000 (0:00:00.027) 0:00:28.894 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/bar-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:28:28 +0000 (0:00:02.169) 0:00:31.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.032) 0:00:31.096 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.027) 0:00:31.124 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.040) 0:00:31.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.038) 0:00:31.203 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.034) 0:00:31.238 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:28:28 +0000 (0:00:00.368) 0:00:31.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:28:29 +0000 (0:00:00.667) 0:00:32.273 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', 
u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:28:29 +0000 (0:00:00.391) 0:00:32.664 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:28:30 +0000 (0:00:00.652) 0:00:33.317 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:28:30 +0000 (0:00:00.368) 0:00:33.685 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:28:30 +0000 (0:00:00.030) 0:00:33.715 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:56 Wednesday 01 June 2022 16:28:31 +0000 (0:00:00.841) 0:00:34.556 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool 
information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:28:31 +0000 (0:00:00.057) 0:00:34.614 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:28:31 +0000 (0:00:00.039) 0:00:34.654 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:28:31 +0000 (0:00:00.028) 0:00:34.683 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "87f312f3-2325-4720-aca9-1242309210ff" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "kjnjdr-CWnb-0tqR-cxM0-0Tf8-Yjf0-01x382" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "E7MEv2-NU0h-vRZx-vEDX-vJOz-EM2c-mt54nF" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", 
"name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:28:32 +0000 (0:00:00.377) 0:00:35.060 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003576", "end": "2022-06-01 12:28:32.124930", "rc": 0, "start": "2022-06-01 12:28:32.121354" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/bar-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:28:32 +0000 (0:00:00.405) 0:00:35.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002555", "end": "2022-06-01 12:28:32.509214", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:28:32.506659" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.379) 0:00:35.846 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
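The warning above suggests setting `loop_var` in `loop_control`. A minimal sketch of that fix, assuming a task shaped like the test suite's pool-verification include (the task name and the replacement variable name `storage_test_pool_item` are hypothetical, not taken from the actual test files):

```yaml
# Hypothetical sketch: give the inner loop its own variable name so it no
# longer collides with 'storage_test_pool', which is already set elsewhere.
- name: Verify each pool (illustrative task, not from the test suite)
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item  # replaces the default/colliding name
```

Tasks inside the included file would then reference `storage_test_pool_item` instead of the colliding variable.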
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.065) 0:00:35.911 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.032) 0:00:35.944 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.061) 0:00:36.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.038) 0:00:36.044 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:28:33 +0000 (0:00:00.733) 0:00:36.778 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.051) 0:00:36.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.038) 0:00:36.867 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.036) 0:00:36.903 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.037) 0:00:36.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.030) 0:00:36.971 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.050) 0:00:37.022 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.057) 0:00:37.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.029) 0:00:37.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.030) 0:00:37.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.030) 0:00:37.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.031) 0:00:37.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.034) 0:00:37.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.034) 0:00:37.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.030) 0:00:37.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.031) 0:00:37.331 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.057) 0:00:37.389 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.063) 0:00:37.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.033) 0:00:37.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.030) 0:00:37.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.029) 0:00:37.547 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.063) 0:00:37.610 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.036) 0:00:37.647 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.039) 0:00:37.686 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:28:34 +0000 (0:00:00.074) 0:00:37.761 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.039) 0:00:37.800 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.036) 0:00:37.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:37.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.030) 0:00:37.899 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.030) 0:00:37.929 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.033) 0:00:37.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.034) 0:00:37.998 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.037) 0:00:38.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.030) 0:00:38.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.030) 0:00:38.128 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.035) 0:00:38.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.195 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.062) 0:00:38.257 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.065) 0:00:38.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.032) 0:00:38.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.034) 0:00:38.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.078) 0:00:38.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.033) 0:00:38.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.035) 0:00:38.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.031) 0:00:38.664 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.062) 0:00:38.726 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:28:35 +0000 (0:00:00.036) 0:00:38.762 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.126) 0:00:38.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.035) 0:00:38.924 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "87f312f3-2325-4720-aca9-1242309210ff" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "87f312f3-2325-4720-aca9-1242309210ff" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 
June 2022 16:28:36 +0000 (0:00:00.041) 0:00:38.966 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.037) 0:00:39.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.037) 0:00:39.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.036) 0:00:39.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.030) 0:00:39.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.030) 0:00:39.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.030) 0:00:39.169 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.031) 0:00:39.200 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.047) 0:00:39.248 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.035) 0:00:39.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.036) 0:00:39.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 
16:28:36 +0000 (0:00:00.028) 0:00:39.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.031) 0:00:39.380 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.038) 0:00:39.419 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:28:36 +0000 (0:00:00.042) 0:00:39.461 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100907.6681216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100907.6681216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2257, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100907.6681216, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, 
"xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.378) 0:00:39.840 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.038) 0:00:39.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.036) 0:00:39.915 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.032) 0:00:39.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.029) 0:00:39.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.034) 0:00:40.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.031) 0:00:40.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.029) 0:00:40.103 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.036) 0:00:40.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.032) 0:00:40.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.294 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.037) 0:00:40.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.037) 0:00:40.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.031) 0:00:40.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] 
****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.033) 0:00:40.434 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.028) 0:00:40.524 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.030) 0:00:40.585 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.032) 0:00:40.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.033) 0:00:40.651 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.036) 0:00:40.688 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:28:37 +0000 (0:00:00.031) 0:00:40.720 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.357) 0:00:41.077 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.374) 0:00:41.451 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.038) 0:00:41.489 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.034) 0:00:41.523 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.031) 0:00:41.555 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.030) 0:00:41.585 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.032) 0:00:41.618 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.031) 0:00:41.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.032) 0:00:41.682 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.034) 0:00:41.717 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:28:38 +0000 (0:00:00.034) 0:00:41.752 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.038) 0:00:41.790 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.039386", "end": "2022-06-01 12:28:38.850416", "rc": 0, "start": "2022-06-01 12:28:38.811030" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.395) 0:00:42.186 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.037) 0:00:42.223 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.037) 0:00:42.261 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.034) 0:00:42.295 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.034) 0:00:42.330 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.033) 0:00:42.364 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.032) 0:00:42.396 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.034) 0:00:42.431 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.030) 0:00:42.461 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.028) 0:00:42.490 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:58
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.032) 0:00:42.522 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.078) 0:00:42.600 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:28:39 +0000 (0:00:00.047) 0:00:42.648 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.528) 0:00:43.177 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.076) 0:00:43.253 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.032) 0:00:43.285 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.033) 0:00:43.318 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.065) 0:00:43.384 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.029) 0:00:43.413 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.032) 0:00:43.445 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.040) 0:00:43.486 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.035) 0:00:43.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.032) 0:00:43.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.031) 0:00:43.586 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.034) 0:00:43.621 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.030) 0:00:43.651 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.048) 0:00:43.700 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:28:40 +0000 (0:00:00.029) 0:00:43.730 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:28:42 +0000 (0:00:01.857) 0:00:45.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:28:42 +0000 (0:00:00.034) 0:00:45.622 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:28:42 +0000 (0:00:00.034) 0:00:45.657 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/bar-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:28:42 +0000 (0:00:00.046) 0:00:45.704 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:28:42 +0000 (0:00:00.039) 0:00:45.744 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:28:42 +0000 (0:00:00.037) 0:00:45.781 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:28:43 +0000 (0:00:00.030) 0:00:45.811 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:28:43 +0000 (0:00:00.690) 0:00:46.502 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:28:44 +0000 (0:00:00.399) 0:00:46.902 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:28:44 +0000 (0:00:00.651) 0:00:47.553 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:28:45 +0000 (0:00:00.366) 0:00:47.920 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:28:45 +0000 (0:00:00.030) 0:00:47.951 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:71
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.896) 0:00:48.848 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.060) 0:00:48.908 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.039) 0:00:48.948 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.029) 0:00:48.978 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/bar-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/bar-test1", "size": "5G", "type": "lvm", "uuid": "87f312f3-2325-4720-aca9-1242309210ff" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "kjnjdr-CWnb-0tqR-cxM0-0Tf8-Yjf0-01x382" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "E7MEv2-NU0h-vRZx-vEDX-vJOz-EM2c-mt54nF" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.378) 0:00:49.357 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002842", "end": "2022-06-01 12:28:46.394258", "rc": 0, "start": "2022-06-01 12:28:46.391416" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/bar-test1 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:28:46 +0000 (0:00:00.379) 0:00:49.736 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002796", "end": "2022-06-01 12:28:46.772219", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:28:46.769423" }
STDERR: cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:28:47 +0000 (0:00:00.370) 0:00:50.107 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:28:47 +0000 (0:00:00.065) 0:00:50.173 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:28:47 +0000 (0:00:00.067) 0:00:50.240 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:28:47 +0000 (0:00:00.060) 0:00:50.301 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:28:47 +0000 (0:00:00.040) 0:00:50.341 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.733) 0:00:51.074 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.051) 0:00:51.126 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.043) 0:00:51.169 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.045) 0:00:51.215 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.038) 0:00:51.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.031) 0:00:51.284 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }
MSG: All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.053) 0:00:51.338 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.060) 0:00:51.399 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.033) 0:00:51.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.032) 0:00:51.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.030) 0:00:51.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.032) 0:00:51.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.029) 0:00:51.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.037) 0:00:51.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.033) 0:00:51.628 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.030)
0:00:51.659 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.058) 0:00:51.717 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:28:48 +0000 (0:00:00.066) 0:00:51.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.032) 0:00:51.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.030) 0:00:51.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.029) 0:00:51.877 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.061) 0:00:51.938 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.035) 0:00:51.974 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.037) 0:00:52.011 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.065) 0:00:52.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.032) 0:00:52.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 
16:28:49 +0000 (0:00:00.034) 0:00:52.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.029) 0:00:52.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.027) 0:00:52.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.029) 0:00:52.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.031) 0:00:52.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.034) 0:00:52.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.035) 0:00:52.332 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.028) 0:00:52.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.027) 0:00:52.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.030) 0:00:52.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.034) 0:00:52.452 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.032) 0:00:52.484 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.065) 0:00:52.550 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.107) 0:00:52.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.032) 0:00:52.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.031) 0:00:52.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:28:49 +0000 (0:00:00.032) 0:00:52.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.031) 0:00:52.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.031) 0:00:52.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.033) 0:00:52.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.031) 0:00:52.881 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.031) 0:00:52.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:28:50 +0000 (0:00:00.031) 0:00:52.945 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
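The two `[WARNING]` lines in this log flag a loop-variable collision and name the remedy: set a distinct `loop_var` via `loop_control` on the looping include. A minimal sketch of that fix, with a hypothetical task shape and variable name (not taken from this test suite):

```yaml
# Hypothetical illustration of the fix the warning suggests: give the
# outer include_tasks loop its own loop variable so a nested looping
# include does not collide with the same default name.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_outer  # distinct name avoids the collision
```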
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.062)       0:00:53.007 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.036)       0:00:53.043 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.123)       0:00:53.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.036)       0:00:53.203 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "87f312f3-2325-4720-aca9-1242309210ff" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/bar-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "87f312f3-2325-4720-aca9-1242309210ff" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.042)       0:00:53.246 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.038)       0:00:53.285 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.037)       0:00:53.322 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.038)       0:00:53.361 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.031)       0:00:53.392 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.031)       0:00:53.424 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.038)       0:00:53.462 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.036)       0:00:53.499 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/bar-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.059)       0:00:53.558 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.035)       0:00:53.594 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.036)       0:00:53.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.031)       0:00:53.662 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.032)       0:00:53.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.040)       0:00:53.736 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:28:50 +0000 (0:00:00.037)       0:00:53.773 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100907.6681216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100907.6681216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2257, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100907.6681216, "nlink": 1, "path": "/dev/mapper/bar-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.395)       0:00:54.169 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.038)       0:00:54.207 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.037)       0:00:54.245 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.034)       0:00:54.279 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.029)       0:00:54.309 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.037)       0:00:54.347 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.030)       0:00:54.378 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.032)       0:00:54.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.030)       0:00:54.441 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.039)       0:00:54.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.030)       0:00:54.512 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.034)       0:00:54.546 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.030)       0:00:54.577 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.031)       0:00:54.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.027)       0:00:54.636 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.038)       0:00:54.674 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.037)       0:00:54.712 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.033)       0:00:54.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:28:51 +0000 (0:00:00.031)       0:00:54.777 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.031)       0:00:54.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.031)       0:00:54.840 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.030)       0:00:54.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.030)       0:00:54.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.033)       0:00:54.933 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.075)       0:00:55.009 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.031)       0:00:55.041 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.030)       0:00:55.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.030)       0:00:55.102 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:28:52 +0000 (0:00:00.392)       0:00:55.494 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.396)       0:00:55.891 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.039)       0:00:55.930 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.034)       0:00:55.964 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.032)       0:00:55.997 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.030)       0:00:56.028 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.030)       0:00:56.059 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.028)       0:00:56.087 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.029)       0:00:56.117 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.032)       0:00:56.150 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:28:53 +0000 (0:00:00.032)       0:00:56.182 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:28:53 +0000 (0:00:00.037) 0:00:56.220 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "bar/test1" ], "delta": "0:00:00.041344", "end": "2022-06-01 12:28:53.295695", "rc": 0, "start": "2022-06-01 12:28:53.254351" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:28:53 +0000 (0:00:00.413) 0:00:56.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:28:53 +0000 (0:00:00.038) 0:00:56.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:28:53 +0000 (0:00:00.038) 0:00:56.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:28:53 +0000 (0:00:00.033) 0:00:56.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 
01 June 2022 16:28:53 +0000 (0:00:00.034) 0:00:56.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.037) 0:00:56.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.032) 0:00:56.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.031) 0:00:56.880 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.031) 0:00:56.912 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.028) 0:00:56.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:73 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.030) 0:00:56.970 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.077) 0:00:57.048 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.047) 0:00:57.095 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.559) 0:00:57.655 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.073) 0:00:57.728 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:28:54 +0000 (0:00:00.032) 0:00:57.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.032) 0:00:57.792 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.062) 0:00:57.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.028) 0:00:57.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.030) 0:00:57.914 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "bar", "state": "absent", "volumes": [ { "fs_type": "ext4", "mount_point": 
"/opt/test1", "name": "test1", "size": "5g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.084) 0:00:57.999 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.034) 0:00:58.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.035) 0:00:58.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.030) 0:00:58.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.030) 0:00:58.130 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.029) 0:00:58.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.046) 0:00:58.207 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:28:55 +0000 (0:00:00.028) 0:00:58.235 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/bar", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:28:58 +0000 (0:00:02.877) 0:01:01.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.034) 0:01:01.148 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.029) 0:01:01.177 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy 
format", "device": "/dev/mapper/bar-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/bar-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/bar", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.042) 0:01:01.220 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the 
list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.038) 0:01:01.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.034) 0:01:01.293 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/bar-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/bar-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/bar-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:28:58 +0000 (0:00:00.392) 0:01:01.685 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:28:59 +0000 (0:00:00.644) 0:01:02.330 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:28:59 +0000 (0:00:00.030) 0:01:02.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:29:00 +0000 (0:00:00.669) 0:01:03.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:29:00 +0000 (0:00:00.384) 0:01:03.414 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:29:00 +0000 (0:00:00.032) 0:01:03.446 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:88 Wednesday 01 June 2022 16:29:01 +0000 (0:00:00.835) 0:01:04.282 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:29:01 +0000 (0:00:00.064) 0:01:04.346 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "bar", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/bar-test1", "_mount_id": "/dev/mapper/bar-test1", "_raw_device": "/dev/mapper/bar-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" 
], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:29:01 +0000 (0:00:00.040) 0:01:04.386 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:29:01 +0000 (0:00:00.029) 0:01:04.416 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", 
"label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.382) 0:01:04.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002561", "end": "2022-06-01 12:29:01.826388", "rc": 0, "start": "2022-06-01 12:29:01.823827" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.366) 0:01:05.165 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002898", "end": "2022-06-01 12:29:02.198598", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:29:02.195700" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.371) 0:01:05.537 ******** [WARNING]: The loop variable 
'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.063) 0:01:05.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.030) 0:01:05.631 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.113) 0:01:05.744 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:29:02 +0000 (0:00:00.039) 0:01:05.783 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.026) 0:01:05.810 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:29:03 
+0000 (0:00:00.027) 0:01:05.838 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.038) 0:01:05.876 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.038) 0:01:05.915 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.037) 0:01:05.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.035) 0:01:05.988 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.031) 0:01:06.020 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.059) 0:01:06.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.029) 0:01:06.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.029) 0:01:06.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.033) 0:01:06.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 
Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.032) 0:01:06.325 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.058) 0:01:06.383 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.061) 0:01:06.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.030) 0:01:06.535 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.063) 0:01:06.599 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.036) 0:01:06.635 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.027) 0:01:06.663 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.027) 0:01:06.691 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:29:03 +0000 (0:00:00.031) 0:01:06.723 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.065) 0:01:06.788 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.063) 0:01:06.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.030) 0:01:06.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.030) 0:01:06.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.031) 0:01:06.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.032) 0:01:06.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.029) 0:01:07.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.030) 0:01:07.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.030) 0:01:07.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.031) 0:01:07.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.033) 0:01:07.133 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.058) 0:01:07.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.034) 0:01:07.225 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.119) 0:01:07.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/bar-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.034) 0:01:07.379 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.038) 0:01:07.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.030) 0:01:07.449 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.033) 0:01:07.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.029) 0:01:07.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.028) 0:01:07.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.028) 
0:01:07.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.028) 0:01:07.599 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.076) 0:01:07.675 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.045) 0:01:07.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.024) 0:01:07.745 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:29:04 +0000 (0:00:00.034) 0:01:07.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.029) 0:01:07.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.031) 0:01:07.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.034) 0:01:07.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.024) 0:01:07.900 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.382) 0:01:08.282 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.037) 0:01:08.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.025) 0:01:08.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.032) 0:01:08.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.033) 0:01:08.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.028) 0:01:08.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.033) 0:01:08.473 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.031) 0:01:08.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.029) 0:01:08.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.024) 0:01:08.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.029) 0:01:08.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.030) 0:01:08.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.034) 0:01:08.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.032) 0:01:08.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.031) 0:01:08.717 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:29:05 +0000 (0:00:00.039) 0:01:08.757 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.036) 0:01:08.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:08.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.032) 0:01:08.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:08.886 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:08.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:08.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:08.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.032) 0:01:09.042 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:09.072 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:09.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.033) 0:01:09.227 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.033) 0:01:09.260 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.291 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:09.352 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:09.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.414 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.032) 0:01:09.446 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.032) 0:01:09.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.032) 0:01:09.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.029) 0:01:09.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.028) 0:01:09.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.028) 0:01:09.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.028) 0:01:09.719 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.030) 0:01:09.750 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:29:06 +0000 (0:00:00.031) 0:01:09.781 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:29:07 +0000 (0:00:00.028) 0:01:09.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
*********************************************************************
/cache/rhel-x.qcow2 : ok=392 changed=7 unreachable=0 failed=0 skipped=305 rescued=0 ignored=0

Wednesday 01 June 2022  16:29:07 +0000 (0:00:00.015)       0:01:09.825 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.17s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_scsi_generated.yml:3 ----
linux-system-roles.storage : make sure blivet is available -------------- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.79s
/tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml:2 -------------------
Get the canonical device path for each member device -------------------- 0.78s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
Get the canonical device path for each member device -------------------- 0.73s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
Get the canonical device path for each member device -------------------- 0.73s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:29:07 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022  16:29:09 +0000 (0:00:01.279)       0:00:01.302 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_mount.yml ***********************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_change_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:2
Wednesday 01 June 2022  16:29:09 +0000 (0:00:00.014)       0:00:01.317 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:12
Wednesday 01 June 2022  16:29:10 +0000 (0:00:01.038)       0:00:02.355 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.038)       0:00:02.394 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.149)       0:00:02.543 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.530)       0:00:03.074 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.079)       0:00:03.154 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.022)       0:00:03.177 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:29:10 +0000 (0:00:00.022)       0:00:03.199 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:29:11 +0000 (0:00:00.191)       0:00:03.390 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:29:11 +0000 (0:00:00.018)       0:00:03.409 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:29:12 +0000 (0:00:01.083)       0:00:04.492 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:29:12 +0000 (0:00:00.046)       0:00:04.539 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:29:12 +0000 (0:00:00.048)       0:00:04.588 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.688)       0:00:05.277 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.080)       0:00:05.358 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.020)       0:00:05.378 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.021)       0:00:05.400 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.019)       0:00:05.420 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:29:13 +0000 (0:00:00.795)       0:00:06.216 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:29:15 +0000 (0:00:01.805) 0:00:08.022 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:29:15 +0000 
(0:00:00.043) 0:00:08.065 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:29:15 +0000 (0:00:00.027) 0:00:08.093 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.521) 0:00:08.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.058) 0:00:08.673 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.027) 0:00:08.701 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.035) 0:00:08.736 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.033) 0:00:08.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.033) 0:00:08.803 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.027) 0:00:08.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.033) 0:00:08.864 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.028) 0:00:08.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:29:16 +0000 (0:00:00.028) 0:00:08.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:29:17 +0000 (0:00:00.443) 0:00:09.365 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:29:17 +0000 (0:00:00.027) 0:00:09.392 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:15 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.842) 0:00:10.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:22 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.031) 0:00:10.266 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.043) 0:00:10.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.497) 0:00:10.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.036) 0:00:10.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.030) 0:00:10.874 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a LVM logical volume mounted at "/opt/test1"] ********************* task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:27 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.034) 0:00:10.909 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.053) 0:00:10.963 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:29:18 +0000 (0:00:00.041) 0:00:11.004 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.506) 0:00:11.511 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": 
false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.068) 0:00:11.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.029) 0:00:11.609 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.028) 0:00:11.638 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.060) 0:00:11.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.024) 0:00:11.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.028) 0:00:11.752 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.034) 0:00:11.786 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.031) 0:00:11.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.062) 0:00:11.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.029) 0:00:11.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 
01 June 2022 16:29:19 +0000 (0:00:00.028) 0:00:11.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.027) 0:00:11.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.041) 0:00:12.007 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:29:19 +0000 (0:00:00.027) 0:00:12.034 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:29:21 +0000 (0:00:01.888) 0:00:13.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:29:21 +0000 (0:00:00.067) 0:00:13.991 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:29:21 +0000 (0:00:00.054) 0:00:14.045 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:29:21 +0000 (0:00:00.067) 0:00:14.112 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 
Wednesday 01 June 2022 16:29:21 +0000 (0:00:00.057) 0:00:14.170 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:29:21 +0000 (0:00:00.039) 0:00:14.209 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:29:22 +0000 (0:00:00.033) 0:00:14.243 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:29:22 +0000 (0:00:00.942) 0:00:15.185 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:29:23 +0000 (0:00:00.533) 0:00:15.719 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:29:24 +0000 (0:00:00.640) 0:00:16.359 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:29:24 +0000 (0:00:00.352) 0:00:16.711 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:29:24 +0000 (0:00:00.029) 0:00:16.741 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:39 Wednesday 01 June 2022 16:29:25 +0000 (0:00:00.841) 0:00:17.582 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:29:25 +0000 (0:00:00.052) 0:00:17.635 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:29:25 +0000 (0:00:00.038) 0:00:17.673 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:29:25 +0000 (0:00:00.030) 0:00:17.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1FbG22-IrSe-6T4a-IKvt-aljU-I0qy-5LfyB4" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": 
"", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:29:25 +0000 (0:00:00.522) 0:00:18.226 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002491", "end": "2022-06-01 12:29:25.938218", "rc": 0, "start": "2022-06-01 12:29:25.935727" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:29:26 +0000 (0:00:00.491) 0:00:18.717 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003297", "end": "2022-06-01 12:29:26.316099", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:29:26.312802" } STDERR: cat: /etc/crypttab: No such 
file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:29:26 +0000 (0:00:00.379) 0:00:19.097 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:29:26 +0000 (0:00:00.061) 0:00:19.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:29:26 +0000 (0:00:00.031) 0:00:19.190 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.062) 0:00:19.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.037) 0:00:19.291 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:29:27 +0000 
(0:00:00.461) 0:00:19.752 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.041) 0:00:19.794 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.038) 0:00:19.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.035) 0:00:19.868 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.036) 0:00:19.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.031) 0:00:19.936 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.043) 0:00:19.979 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.058) 0:00:20.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.030) 0:00:20.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.031) 0:00:20.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.037) 0:00:20.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.030) 0:00:20.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:29:27 +0000 (0:00:00.031) 0:00:20.198 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.030) 0:00:20.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.029) 0:00:20.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.029) 0:00:20.287 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.060) 0:00:20.348 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.060) 0:00:20.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.029) 0:00:20.438 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.028) 0:00:20.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.028) 0:00:20.496 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.057) 0:00:20.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.034) 0:00:20.588 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.033) 0:00:20.621 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.057) 0:00:20.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.033) 0:00:20.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.034) 0:00:20.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.027) 0:00:20.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.028) 0:00:20.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.030) 0:00:20.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.029) 0:00:20.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.030) 0:00:20.893 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.059) 0:00:20.952 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.060) 0:00:21.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.029) 0:00:21.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.028) 0:00:21.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 
Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.030) 0:00:21.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.030) 0:00:21.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:29:28 +0000 (0:00:00.067) 0:00:21.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.032) 0:00:21.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.032) 0:00:21.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.031) 0:00:21.295 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.030) 0:00:21.326 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.060) 0:00:21.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.034) 0:00:21.420 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.124) 0:00:21.545 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.035) 0:00:21.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.044) 0:00:21.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.037) 0:00:21.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:29:29 +0000 
(0:00:00.034) 0:00:21.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.036) 0:00:21.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.033) 0:00:21.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.036) 0:00:21.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.034) 0:00:21.837 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.032) 0:00:21.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.047) 0:00:21.917 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.033) 0:00:21.951 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.035) 0:00:21.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.029) 0:00:22.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.034) 0:00:22.051 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.037) 0:00:22.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:29:29 +0000 (0:00:00.038) 0:00:22.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100960.9711215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100960.9711215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2627, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100960.9711215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.378) 0:00:22.505 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.038) 0:00:22.544 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.035) 0:00:22.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.032) 0:00:22.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:22.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.034) 0:00:22.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.029) 0:00:22.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:22.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:22.770 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.037) 0:00:22.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.032) 0:00:22.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.029) 0:00:22.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.029) 0:00:22.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.029) 0:00:22.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:22.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.038) 0:00:23.000 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.036) 0:00:23.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.028) 0:00:23.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.029) 0:00:23.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.028) 0:00:23.123 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.030) 0:00:23.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:23.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:29:30 +0000 (0:00:00.031) 0:00:23.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.030) 0:00:23.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.029) 0:00:23.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.029) 0:00:23.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.029) 0:00:23.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.027) 0:00:23.365 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:29:31 +0000 (0:00:00.508) 0:00:23.874 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.373) 0:00:24.247 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.038) 0:00:24.286 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.032) 0:00:24.319 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.029) 0:00:24.349 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.030) 0:00:24.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.029) 0:00:24.409 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.030) 0:00:24.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.031) 0:00:24.472 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.033) 0:00:24.505 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.031) 0:00:24.537 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.037) 0:00:24.574 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036954", "end": "2022-06-01 12:29:32.203565", "rc": 0, "start": "2022-06-01 12:29:32.166611" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.409) 0:00:24.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.039) 0:00:25.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.039) 0:00:25.062 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.034) 0:00:25.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.036) 0:00:25.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.033) 0:00:25.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:29:32 +0000 (0:00:00.032) 0:00:25.199 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.030) 0:00:25.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.029) 0:00:25.260 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.043) 0:00:25.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change the mount location to "/opt/test2"] ******************************* task path: 
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:41 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.034) 0:00:25.338 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.065) 0:00:25.404 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.044) 0:00:25.448 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.511) 0:00:25.960 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list 
of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.075) 0:00:26.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.032) 0:00:26.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.033) 0:00:26.101 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.060) 0:00:26.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.026) 0:00:26.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:29:33 +0000 (0:00:00.032) 0:00:26.220 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.037) 0:00:26.257 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.034) 0:00:26.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.029) 0:00:26.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.029) 0:00:26.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.030) 0:00:26.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.031) 0:00:26.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.092) 0:00:26.507 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:29:34 +0000 (0:00:00.029) 0:00:26.536 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:29:35 +0000 (0:00:01.315) 0:00:27.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:29:35 +0000 (0:00:00.030) 0:00:27.883 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:29:35 +0000 (0:00:00.029) 0:00:27.912 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], 
"pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:29:35 +0000 (0:00:00.040) 0:00:27.953 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:29:35 +0000 (0:00:00.039) 0:00:27.992 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:29:35 +0000 (0:00:00.033) 0:00:28.026 ******** changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh 
its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:29:36 +0000 (0:00:00.408) 0:00:28.434 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:29:36 +0000 (0:00:00.660) 0:00:29.094 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:29:37 +0000 (0:00:00.416) 0:00:29.511 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:29:37 +0000 (0:00:00.648) 0:00:30.160 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:29:38 +0000 (0:00:00.351) 0:00:30.511 ******** TASK 
[linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:29:38 +0000 (0:00:00.030) 0:00:30.542 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:53 Wednesday 01 June 2022 16:29:39 +0000 (0:00:00.847) 0:00:31.389 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:29:39 +0000 (0:00:00.055) 0:00:31.444 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": 
[], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:29:39 +0000 (0:00:00.039) 0:00:31.484 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:29:39 +0000 (0:00:00.030) 0:00:31.514 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1FbG22-IrSe-6T4a-IKvt-aljU-I0qy-5LfyB4" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:29:39 +0000 (0:00:00.385) 0:00:31.899 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002848", "end": "2022-06-01 12:29:39.495981", "rc": 0, "start": "2022-06-01 12:29:39.493133" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.379) 0:00:32.278 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002590", "end": "2022-06-01 12:29:39.865532", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:29:39.862942" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.365) 0:00:32.644 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.109) 0:00:32.754 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.031) 0:00:32.786 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.062) 0:00:32.848 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:29:40 +0000 (0:00:00.038) 0:00:32.887 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.390) 0:00:33.277 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.042) 0:00:33.319 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.039) 0:00:33.358 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.036) 0:00:33.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.033) 0:00:33.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.027) 0:00:33.456 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.041) 0:00:33.498 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.058) 0:00:33.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.030) 0:00:33.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.030) 0:00:33.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.029) 0:00:33.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.032) 0:00:33.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.031) 0:00:33.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:29:41 +0000 (0:00:00.030) 0:00:33.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.030) 0:00:33.772 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.029) 0:00:33.801 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.061) 0:00:33.863 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.061) 0:00:33.925 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.031) 0:00:33.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:29:41 +0000 (0:00:00.030) 0:00:33.987 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.031) 0:00:34.018 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.059) 0:00:34.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.036) 0:00:34.114 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.036) 0:00:34.150 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:29:41 +0000 (0:00:00.058) 0:00:34.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.036) 0:00:34.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.037) 0:00:34.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.030) 0:00:34.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.030) 0:00:34.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.031) 0:00:34.375 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.029) 0:00:34.404 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.032) 0:00:34.436 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.063) 0:00:34.500 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.069) 0:00:34.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.031) 0:00:34.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.030) 0:00:34.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.030) 0:00:34.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.032) 0:00:34.694 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.031) 0:00:34.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.033) 0:00:34.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.037) 0:00:34.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.030) 0:00:34.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.031) 0:00:34.858 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.105) 0:00:34.964 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.036) 0:00:35.000 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.132) 0:00:35.133 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.036) 0:00:35.170 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:29:42 +0000 (0:00:00.042) 0:00:35.212 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.040) 0:00:35.253 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:29:43 +0000 
(0:00:00.036) 0:00:35.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.038) 0:00:35.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.030) 0:00:35.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.029) 0:00:35.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.029) 0:00:35.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.032) 0:00:35.451 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.044) 0:00:35.496 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.033) 0:00:35.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.035) 0:00:35.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.029) 0:00:35.594 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.031) 0:00:35.625 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.039) 0:00:35.664 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.036) 0:00:35.701 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100960.9711215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100960.9711215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2627, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100960.9711215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.376) 0:00:36.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.036) 0:00:36.113 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.041) 0:00:36.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.038) 0:00:36.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:29:43 +0000 (0:00:00.030) 0:00:36.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.039) 0:00:36.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.030) 0:00:36.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.354 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.037) 0:00:36.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.544 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.039) 0:00:36.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.034) 0:00:36.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.032) 0:00:36.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.030) 0:00:36.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.030) 0:00:36.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.033) 0:00:36.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.031) 0:00:36.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.032) 0:00:36.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.032) 0:00:36.932 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:29:44 +0000 (0:00:00.029) 0:00:36.962 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.370) 0:00:37.332 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.381) 0:00:37.714 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.037) 0:00:37.752 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.033) 0:00:37.786 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.032) 0:00:37.819 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.031) 0:00:37.850 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.030) 0:00:37.880 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.030) 0:00:37.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.029) 0:00:37.940 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.033) 0:00:37.974 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.034) 0:00:38.009 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:29:45 +0000 (0:00:00.038) 0:00:38.048 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038109", "end": "2022-06-01 12:29:45.679647", "rc": 0, "start": "2022-06-01 12:29:45.641538" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.412) 0:00:38.460 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.038) 0:00:38.498 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.039) 0:00:38.537 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.035) 0:00:38.573 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.035) 0:00:38.609 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.034) 0:00:38.644 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.031) 0:00:38.675 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.033) 0:00:38.708 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.030) 0:00:38.739 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.028) 0:00:38.767 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path:
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:55
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.077) 0:00:38.876 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:29:46 +0000 (0:00:00.045) 0:00:38.921 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.550) 0:00:39.472 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.071) 0:00:39.543 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.031) 0:00:39.575 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.030) 0:00:39.605 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.063) 0:00:39.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.024) 0:00:39.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.029) 0:00:39.723 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.035) 0:00:39.759 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.033) 0:00:39.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.030) 0:00:39.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.029) 0:00:39.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.031) 0:00:39.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.030) 0:00:39.914 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.044) 0:00:39.959 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:29:47 +0000 (0:00:00.031) 0:00:39.990 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:29:49 +0000 (0:00:01.308) 0:00:41.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.031) 0:00:41.331 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.029) 0:00:41.360 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.039) 0:00:41.399 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.036) 0:00:41.436 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.033) 0:00:41.469 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.028) 0:00:41.498 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:29:49 +0000 (0:00:00.633) 0:00:42.132 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:29:50 +0000 (0:00:00.397) 0:00:42.530 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:29:50 +0000 (0:00:00.652) 0:00:43.182 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:29:51 +0000 (0:00:00.366) 0:00:43.549 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:29:51 +0000 (0:00:00.030) 0:00:43.580 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:67
Wednesday 01 June 2022 16:29:52 +0000 (0:00:00.855) 0:00:44.435 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:29:52 +0000 (0:00:00.059) 0:00:44.494 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:29:52 +0000 (0:00:00.040) 0:00:44.534 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:29:52 +0000 (0:00:00.028) 0:00:44.563 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1FbG22-IrSe-6T4a-IKvt-aljU-I0qy-5LfyB4" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:29:52 +0000 (0:00:00.395) 0:00:44.959 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002829", "end": "2022-06-01 12:29:52.558102", "rc": 0, "start": "2022-06-01 12:29:52.555273" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.382) 0:00:45.341 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002460", "end": "2022-06-01 12:29:52.927389", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:29:52.924929" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.363) 0:00:45.705 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
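(Editor's note: the `[WARNING]` above is Ansible's standard advice for a loop-variable collision. It can be addressed by giving the include loop a private variable name via `loop_control`; a minimal sketch, assuming the task loops over the `_storage_pools_list` fact shown earlier — the renamed variable `storage_test_pool_item` is hypothetical:)

```yaml
# Hypothetical sketch: rename the loop variable so it no longer
# collides with the storage_test_pool fact set elsewhere in the tests.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item   # avoids clobbering 'storage_test_pool'
```

Tasks inside `test-verify-pool.yml` would then read the pool under the new name, which silences the warning without changing what is verified.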
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.121) 0:00:45.827 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.033) 0:00:45.860 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.073) 0:00:45.934 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:29:53 +0000 (0:00:00.043) 0:00:45.978 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.376) 0:00:46.354 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.041) 0:00:46.396 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.040) 0:00:46.436 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.036) 0:00:46.472 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.037) 0:00:46.510 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.031) 0:00:46.541 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.047) 0:00:46.589 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.056) 0:00:46.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.030) 0:00:46.675 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.029) 0:00:46.705 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.032) 0:00:46.737 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.029) 0:00:46.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.029) 0:00:46.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.029) 0:00:46.826 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.029) 0:00:46.855 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.031) 0:00:46.887 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.062) 0:00:46.950 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.061) 0:00:47.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.031) 0:00:47.042 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.031) 0:00:47.074 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.030) 0:00:47.105 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.060) 0:00:47.166 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:29:54 +0000 (0:00:00.034) 0:00:47.200 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.035) 0:00:47.236 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.057) 0:00:47.294 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.036) 0:00:47.330 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.036) 0:00:47.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.034) 0:00:47.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.030) 0:00:47.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.031) 0:00:47.462 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.030) 0:00:47.493 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO]
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.030) 0:00:47.524 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.065) 0:00:47.589 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.062) 0:00:47.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.030) 0:00:47.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.032) 0:00:47.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.029) 0:00:47.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.029) 0:00:47.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.029) 0:00:47.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.029) 0:00:47.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.030) 0:00:47.864 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.032) 0:00:47.897 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.068) 0:00:47.965 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.060) 0:00:48.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.037) 0:00:48.063 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.123) 0:00:48.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:29:55 +0000 (0:00:00.039) 0:00:48.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "742beced-1df4-46a1-9bc7-da3c00be5b8e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.042) 0:00:48.269 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.037) 0:00:48.306 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:29:56 +0000 
(0:00:00.036) 0:00:48.343 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.037) 0:00:48.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.029) 0:00:48.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.032) 0:00:48.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.029) 0:00:48.472 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.030) 0:00:48.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.046) 0:00:48.549 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.033) 0:00:48.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.035) 0:00:48.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.032) 0:00:48.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.031) 0:00:48.682 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.038) 0:00:48.721 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.036) 0:00:48.758 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654100960.9711215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654100960.9711215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2627, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654100960.9711215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.406) 0:00:49.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:29:56 +0000 (0:00:00.038) 0:00:49.203 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.034) 0:00:49.238 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.034) 0:00:49.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.302 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.034) 0:00:49.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.028) 0:00:49.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.039) 0:00:49.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.030) 0:00:49.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.030) 0:00:49.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.038) 0:00:49.652 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.034) 0:00:49.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.028) 0:00:49.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.028) 0:00:49.773 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.030) 0:00:49.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.035) 0:00:49.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.031) 0:00:49.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.031) 0:00:49.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.029) 0:00:49.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:29:57 +0000 (0:00:00.031) 0:00:50.023 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.395) 0:00:50.419 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.373) 0:00:50.792 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.039) 0:00:50.831 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.033) 0:00:50.865 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.030) 0:00:50.896 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.030) 0:00:50.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.030) 0:00:50.957 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.029) 0:00:50.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.033) 0:00:51.020 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.034) 0:00:51.054 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.033) 0:00:51.087 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:29:58 +0000 (0:00:00.039) 0:00:51.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043787", "end": "2022-06-01 12:29:58.771793", "rc": 0, "start": "2022-06-01 12:29:58.728006" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.424) 0:00:51.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.037) 0:00:51.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.039) 0:00:51.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.033) 0:00:51.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.032) 0:00:51.694 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.030) 0:00:51.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.030) 0:00:51.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.030) 0:00:51.785 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.030) 0:00:51.816 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.028) 0:00:51.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:69 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.029) 0:00:51.874 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.078) 0:00:51.953 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:29:59 +0000 (0:00:00.043) 0:00:51.997 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.515) 0:00:52.513 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list 
of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.071) 0:00:52.584 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.031) 0:00:52.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.029) 0:00:52.645 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.060) 0:00:52.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.027) 0:00:52.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.031) 0:00:52.764 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.036) 0:00:52.800 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.034) 0:00:52.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.031) 0:00:52.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.032) 0:00:52.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.033) 0:00:52.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.029) 0:00:52.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.044) 0:00:53.006 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:30:00 +0000 (0:00:00.027) 0:00:53.033 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:30:02 +0000 (0:00:01.844) 0:00:54.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:30:02 +0000 (0:00:00.030) 0:00:54.908 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:30:02 +0000 (0:00:00.029) 0:00:54.938 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": 
true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:30:02 +0000 (0:00:00.044) 0:00:54.982 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:30:02 +0000 (0:00:00.037) 0:00:55.019 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:30:02 +0000 (0:00:00.033) 0:00:55.053 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": 
"mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:30:03 +0000 (0:00:00.380) 0:00:55.433 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:30:03 +0000 (0:00:00.638) 0:00:56.071 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:30:03 +0000 (0:00:00.031) 0:00:56.102 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:30:04 +0000 (0:00:00.629) 0:00:56.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:30:04 +0000 (0:00:00.367) 0:00:57.100 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:30:04 +0000 
(0:00:00.030) 0:00:57.130 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:82 Wednesday 01 June 2022 16:30:05 +0000 (0:00:00.826) 0:00:57.957 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:30:05 +0000 (0:00:00.071) 0:00:58.028 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:30:05 +0000 (0:00:00.039) 0:00:58.068 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:30:05 +0000 (0:00:00.029) 0:00:58.097 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:30:06 +0000 (0:00:00.366) 0:00:58.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002566", "end": "2022-06-01 12:30:06.051205", "rc": 0, "start": "2022-06-01 12:30:06.048639" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:30:06 +0000 (0:00:00.370) 0:00:58.834 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002606", "end": "2022-06-01 12:30:06.446800", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:30:06.444194" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.393) 0:00:59.228 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.060) 0:00:59.289 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.029) 0:00:59.319 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.061) 0:00:59.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.039) 0:00:59.419 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.028) 0:00:59.447 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.029) 0:00:59.477 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.070) 0:00:59.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.035) 0:00:59.583 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.034) 0:00:59.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.029) 0:00:59.647 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.027) 0:00:59.675 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.056) 0:00:59.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.031) 
0:00:59.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.030) 0:00:59.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.030) 0:00:59.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.028) 0:00:59.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.031) 0:00:59.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.028) 0:00:59.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.027) 0:00:59.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.029) 0:00:59.970 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.055) 0:01:00.026 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.065) 0:01:00.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.036) 0:01:00.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.033) 0:01:00.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:30:07 +0000 (0:00:00.028) 0:01:00.191 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.062) 0:01:00.253 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.035) 0:01:00.289 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.026) 0:01:00.315 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.028) 0:01:00.344 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.030) 0:01:00.374 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.063) 0:01:00.437 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.061) 0:01:00.498 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.030) 0:01:00.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.030) 0:01:00.559 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.028) 0:01:00.587 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.028) 0:01:00.616 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.027) 0:01:00.644 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.029) 0:01:00.674 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.030) 0:01:00.704 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.032) 0:01:00.736 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.027) 0:01:00.764 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
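Ansible's suggested fix for the `[WARNING]` above is to give the inner loop its own variable name via `loop_control`. A minimal sketch of that change, assuming the volumes are iterated with `include_tasks` (the task layout and replacement variable name below are illustrative, not taken from the actual test files):

```yaml
# Hypothetical loop over volumes: rename the loop variable so it no
# longer collides with an enclosing loop that already uses
# 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner  # any name unused by outer loops
```

With a distinct `loop_var`, the included tasks read `storage_test_volume_inner` instead of shadowing the outer variable, and the warning disappears.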
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.055) 0:01:00.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.034) 0:01:00.855 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.118) 0:01:00.974 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.035) 0:01:01.009 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.038) 0:01:01.048 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.028) 0:01:01.076 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.034) 0:01:01.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.031) 0:01:01.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.030) 0:01:01.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:30:08 +0000 (0:00:00.031) 0:01:01.204 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.029) 0:01:01.233 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.030) 0:01:01.263 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.044) 0:01:01.307 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.024) 0:01:01.332 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.034) 0:01:01.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.031) 0:01:01.398 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.099) 0:01:01.497 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.031) 0:01:01.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.025) 0:01:01.554 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.378) 0:01:01.933 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.037) 0:01:01.970 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.025) 0:01:01.995 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.034) 0:01:02.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.034) 0:01:02.064 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.028) 0:01:02.093 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.030) 0:01:02.123 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.030) 0:01:02.154 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.030) 0:01:02.184 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:30:09 +0000 (0:00:00.027) 0:01:02.212 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.033) 0:01:02.245 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.031) 0:01:02.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.031) 0:01:02.308 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.032) 0:01:02.340 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.027) 0:01:02.368 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.039) 0:01:02.407 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.037) 0:01:02.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.027) 0:01:02.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:02.502 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.532 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:02.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:02.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.032) 0:01:02.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:02.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:02.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.775 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.032) 0:01:02.808 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.032) 0:01:02.840 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.871 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.034) 0:01:02.905 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.030) 0:01:02.935 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.031) 0:01:02.967 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.034) 0:01:03.001 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.031) 0:01:03.033 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.033) 0:01:03.066 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.035) 0:01:03.102 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.034) 0:01:03.136 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.029) 0:01:03.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:30:10 +0000 (0:00:00.032) 0:01:03.199 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.029) 0:01:03.229 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.029) 0:01:03.258 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.029) 0:01:03.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.030) 0:01:03.318 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.029) 0:01:03.347 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.031) 0:01:03.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.029) 0:01:03.409 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.028) 0:01:03.438 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.027) 0:01:03.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=378 changed=6 unreachable=0 failed=0 skipped=296 rescued=0 ignored=0

Wednesday 01 June 2022 16:30:11 +0000 (0:00:00.014) 0:01:03.480 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.32s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.31s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:2 -------------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 
4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:30:12 +0000 (0:00:00.026) 0:00:00.026 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:30:13 +0000 (0:00:01.266) 0:00:01.292 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_change_mount_nvme_generated.yml ******************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_change_mount_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers 
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Wednesday 01 June 2022 16:30:13 +0000 (0:00:00.017) 0:00:01.310 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
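The "censored" results in the setup plays are produced by Ansible's no_log handling: when a task sets `no_log: true`, every registered result and per-item result is replaced with the censored placeholder seen in the recap above. A minimal sketch of such a task follows; the module and variable names are hypothetical, since the real task in rhel-x_setup.yml is hidden by design:

```yaml
# Hypothetical task illustrating the censoring seen in this log.
# The actual "set up internal repositories" task body is not recoverable
# from the output, precisely because no_log suppresses it.
- name: set up internal repositories
  yum_repository:
    name: "{{ item.name }}"
    baseurl: "{{ item.baseurl }}"   # may embed credentials, hence no_log
  loop: "{{ internal_repos }}"
  no_log: true                      # replaces all task output with "censored"
```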
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:30:14 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Wednesday 01 June 2022 16:30:15 +0000 (0:00:01.276) 0:00:01.300 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_change_mount_scsi_generated.yml ********************************
2 plays in /tmp/tmp7247_7fr/tests/tests_change_mount_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount_scsi_generated.yml:3
Wednesday 01 June 2022 16:30:15 +0000 (0:00:00.015) 0:00:01.316 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount_scsi_generated.yml:7
Wednesday 01 June 2022 16:30:16 +0000 (0:00:01.069) 0:00:02.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:2
Wednesday 01 June 2022 16:30:16 +0000 (0:00:00.024) 0:00:02.411 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:12
Wednesday 01 June 2022 16:30:17 +0000 (0:00:00.804) 0:00:03.215 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:30:17 +0000 (0:00:00.039) 0:00:03.255 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:30:17 +0000 (0:00:00.156) 0:00:03.412 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:30:17 +0000 (0:00:00.530) 0:00:03.942 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed":
false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:30:18 +0000 (0:00:00.077) 0:00:04.020 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:30:18 +0000 (0:00:00.024) 0:00:04.044 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:30:18 +0000 (0:00:00.022) 0:00:04.067 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host
machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:30:18 +0000 (0:00:00.197) 0:00:04.264 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:30:18 +0000 (0:00:00.019) 0:00:04.284 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:30:19 +0000 (0:00:01.106) 0:00:05.390 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:30:19 +0000 (0:00:00.046) 0:00:05.437 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:30:19 +0000 (0:00:00.046) 0:00:05.484 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:30:20 +0000 (0:00:00.691) 0:00:06.175 ********
included:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:30:20 +0000 (0:00:00.080) 0:00:06.256 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:30:20 +0000 (0:00:00.021) 0:00:06.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:30:20 +0000 (0:00:00.023) 0:00:06.301 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:30:20 +0000 (0:00:00.020) 0:00:06.321 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:30:21 +0000 (0:00:00.797) 0:00:07.119 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd",
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:30:22 +0000 (0:00:01.803) 0:00:08.923 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:30:22 +0000 (0:00:00.043) 0:00:08.966 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:30:22 +0000 (0:00:00.026) 0:00:08.992 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.492) 0:00:09.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.030) 0:00:09.515 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.026) 0:00:09.542 
********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }
TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.031) 0:00:09.573 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.030) 0:00:09.604 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.031) 0:00:09.635 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.026) 0:00:09.662 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.028) 0:00:09.690 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.027) 0:00:09.718
********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:30:23 +0000 (0:00:00.027) 0:00:09.745 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }
TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:30:24 +0000 (0:00:00.442) 0:00:10.188 ********
TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:30:24 +0000 (0:00:00.026) 0:00:10.214 ********
ok: [/cache/rhel-x.qcow2]
TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:15
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.797) 0:00:11.011 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:22
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.030) 0:00:11.042 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2
TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.044) 0:00:11.086 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }
TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.520) 0:00:11.607 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }
TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.036) 0:00:11.643 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.029) 0:00:11.673 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }
TASK [Create a LVM logical volume mounted at "/opt/test1"] *********************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:27
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.034) 0:00:11.707 ********
TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.053) 0:00:11.761 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:30:25 +0000 (0:00:00.041) 0:00:11.802 ********
ok: [/cache/rhel-x.qcow2]
TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.516) 0:00:12.319 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.065) 0:00:12.384 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.028) 0:00:12.412 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.030) 0:00:12.443 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.093) 0:00:12.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.025) 0:00:12.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.029) 0:00:12.592 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" } ] } ] }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.035) 0:00:12.627 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.030) 0:00:12.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.031) 0:00:12.689 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.029) 0:00:12.719 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.027) 0:00:12.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.028) 0:00:12.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.042) 0:00:12.817 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:30:26 +0000 (0:00:00.026) 0:00:12.844 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1",
"fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:30:28 +0000 (0:00:01.821) 
0:00:14.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.029) 0:00:14.695 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.027) 0:00:14.723 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.045) 0:00:14.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.044) 0:00:14.812 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.033) 0:00:14.846 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:30:28 +0000 (0:00:00.032) 0:00:14.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:30:29 +0000 (0:00:00.900) 0:00:15.779 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, 
"path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:30:30 +0000 (0:00:00.546) 0:00:16.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:30:30 +0000 (0:00:00.633) 0:00:16.959 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:30:31 +0000 (0:00:00.358) 0:00:17.318 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:30:31 +0000 (0:00:00.029) 0:00:17.347 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:39 Wednesday 01 June 2022 16:30:32 +0000 (0:00:00.844) 0:00:18.192 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:30:32 +0000 (0:00:00.051) 0:00:18.244 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:30:32 +0000 (0:00:00.038) 0:00:18.282 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:30:32 +0000 (0:00:00.062) 0:00:18.345 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "EG9AE2-cb8E-MgHQ-fMdo-yjiI-bpv5-hV43M9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:30:32 +0000 (0:00:00.479) 0:00:18.824 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002427", "end": "2022-06-01 12:30:32.737109", "rc": 0, "start": "2022-06-01 12:30:32.734682" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.472) 0:00:19.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002731", "end": "2022-06-01 12:30:33.111243", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:30:33.108512" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.364) 0:00:19.660 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.058) 0:00:19.719 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.028) 0:00:19.747 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.058) 0:00:19.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:30:33 +0000 (0:00:00.034) 0:00:19.841 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.452) 0:00:20.294 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.039) 0:00:20.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.038) 0:00:20.372 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.034) 0:00:20.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.034) 0:00:20.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.028) 0:00:20.470 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.040) 0:00:20.511 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.055) 0:00:20.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.030) 0:00:20.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.027) 0:00:20.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.029) 0:00:20.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.036) 0:00:20.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.035) 0:00:20.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.030) 0:00:20.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.030) 0:00:20.787 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.029) 0:00:20.817 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.059) 0:00:20.877 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.060) 0:00:20.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:30:34 +0000 (0:00:00.030) 0:00:20.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.029) 0:00:20.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.030) 0:00:21.028 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.060) 0:00:21.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.035) 0:00:21.124 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.034) 0:00:21.159 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.103) 0:00:21.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.040) 0:00:21.303 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.036) 0:00:21.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.029) 0:00:21.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.029) 0:00:21.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.029) 0:00:21.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.460 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.029) 0:00:21.489 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.062) 0:00:21.552 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK 
[get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.069) 0:00:21.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.030) 0:00:21.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.030) 0:00:21.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.030) 0:00:21.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.032) 0:00:21.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** 
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.871 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.031) 0:00:21.902 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:30:35 +0000 (0:00:00.063) 0:00:21.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.034) 0:00:22.000 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.127) 0:00:22.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.034) 0:00:22.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, 
"mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.040) 0:00:22.202 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.035) 0:00:22.237 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.033) 0:00:22.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.035) 0:00:22.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.028) 0:00:22.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:30:36 +0000 
(0:00:00.028) 0:00:22.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.030) 0:00:22.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.029) 0:00:22.424 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.044) 0:00:22.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.033) 0:00:22.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.036) 0:00:22.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.029) 0:00:22.568 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.033) 0:00:22.601 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.037) 0:00:22.639 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:30:36 +0000 (0:00:00.047) 0:00:22.686 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101028.0461216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101028.0461216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2824, "isblk": true, "ischr": false, "isdir": 
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101028.0461216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.393) 0:00:23.080 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.038) 0:00:23.118 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.036) 0:00:23.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.032) 0:00:23.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.032) 0:00:23.219 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat 
the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.034) 0:00:23.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.028) 0:00:23.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.037) 0:00:23.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.081) 0:00:23.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.032) 0:00:23.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.030) 0:00:23.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.031) 0:00:23.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.038) 0:00:23.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.038) 0:00:23.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.030) 0:00:23.752 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.030) 0:00:23.782 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.029) 0:00:23.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.033) 0:00:23.846 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.039) 0:00:23.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.033) 0:00:23.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.031) 0:00:23.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:30:37 +0000 (0:00:00.030) 0:00:23.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.028) 0:00:24.010 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.455) 0:00:24.465 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": 
"3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.361) 0:00:24.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.038) 0:00:24.866 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.034) 0:00:24.901 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.029) 0:00:24.930 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.031) 0:00:24.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:30:38 +0000 (0:00:00.032) 0:00:24.994 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.030) 0:00:25.025 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.033) 0:00:25.058 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.032) 0:00:25.092 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.038) 0:00:25.125 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.038) 0:00:25.164 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035665", "end": "2022-06-01 12:30:39.010383", "rc": 0, "start": "2022-06-01 12:30:38.974718" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.397) 0:00:25.562 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.039) 0:00:25.601 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.038) 0:00:25.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.032) 0:00:25.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.031) 0:00:25.703 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.031) 0:00:25.734 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.039) 0:00:25.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable
namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.033) 0:00:25.808 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.030) 0:00:25.838 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.034) 0:00:25.873 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change the mount location to "/opt/test2"] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:41
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.065) 0:00:25.908 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:30:39 +0000 (0:00:00.065) 0:00:25.974 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.046) 0:00:26.020 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.504) 0:00:26.524 ********
skipping:
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.072) 0:00:26.597 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.031) 0:00:26.628 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.031) 0:00:26.660 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for
/cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.061) 0:00:26.721 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.025) 0:00:26.747 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.030) 0:00:26.777 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.039) 0:00:26.816 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.032) 0:00:26.849 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.030) 0:00:26.879 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.030) 0:00:26.909 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.030) 0:00:26.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:30:40 +0000 (0:00:00.029) 0:00:26.969 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:30:41 +0000 (0:00:00.045) 0:00:27.015 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:30:41 +0000 (0:00:00.026) 0:00:27.041 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb",
"/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:30:42 +0000 (0:00:01.307) 0:00:28.349 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason":
"Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.031) 0:00:28.380 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.027) 0:00:28.407 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true,
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.038) 0:00:28.446 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g",
"state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.036) 0:00:28.482 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.039) 0:00:28.521 ********
changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:30:42 +0000 (0:00:00.400) 0:00:28.922 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:30:43 +0000 (0:00:00.636) 0:00:29.558 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state":
"mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:30:43 +0000 (0:00:00.428) 0:00:29.987 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:30:44 +0000 (0:00:00.625) 0:00:30.612 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:30:44 +0000 (0:00:00.356) 0:00:30.969 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:30:45 +0000 (0:00:00.030) 0:00:30.999 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:53
Wednesday 01 June 2022 16:30:45 +0000 (0:00:00.852) 0:00:31.852 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:30:45 +0000 (0:00:00.056) 0:00:31.908 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null,
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:30:45 +0000 (0:00:00.033) 0:00:31.949 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.]
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:30:45 +0000 (0:00:00.033) 0:00:31.983 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "EG9AE2-cb8E-MgHQ-fMdo-yjiI-bpv5-hV43M9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:30:46 +0000 (0:00:00.380) 0:00:32.363 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003178", "end": "2022-06-01 12:30:46.201611", "rc": 0, "start": "2022-06-01 12:30:46.198433" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:30:46 +0000 (0:00:00.395) 0:00:32.759 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002833", "end": "2022-06-01 12:30:46.569233", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:30:46.566400" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.361) 0:00:33.120 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.066) 0:00:33.187 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.032) 0:00:33.219 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.064) 0:00:33.284 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.040) 0:00:33.324 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.359) 0:00:33.684 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK
[Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.042) 0:00:33.727 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.037) 0:00:33.765 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.036) 0:00:33.801 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.037) 0:00:33.838 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.032) 0:00:33.871 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.051) 0:00:33.923 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID]
**********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:30:47 +0000 (0:00:00.058) 0:00:33.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.033) 0:00:34.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.031) 0:00:34.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.030) 0:00:34.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.032) 0:00:34.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.030) 0:00:34.141 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June
2022 16:30:48 +0000 (0:00:00.030) 0:00:34.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.032) 0:00:34.204 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.031) 0:00:34.235 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.057) 0:00:34.292 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.099) 0:00:34.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.030) 0:00:34.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:30:48 +0000 (0:00:00.030) 0:00:34.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.030) 0:00:34.484 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.062) 0:00:34.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.034) 0:00:34.581 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.033) 0:00:34.615 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.055) 0:00:34.671 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.035) 0:00:34.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.036) 0:00:34.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.029) 0:00:34.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.028) 0:00:34.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.028) 0:00:34.830 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.029) 0:00:34.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.031) 0:00:34.891 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:30:48 +0000 (0:00:00.065) 0:00:34.956 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.065) 0:00:35.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.032) 0:00:35.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.031) 0:00:35.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.030) 0:00:35.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.031) 0:00:35.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.030) 0:00:35.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.030) 0:00:35.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.031) 0:00:35.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.030) 0:00:35.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.030) 0:00:35.303 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.059) 0:00:35.362 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.037) 0:00:35.400 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.125) 0:00:35.526 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.035) 0:00:35.562 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.041) 0:00:35.604 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.037) 0:00:35.641 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:30:49 +0000 
(0:00:00.035) 0:00:35.677 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.040) 0:00:35.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.036) 0:00:35.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.036) 0:00:35.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.032) 0:00:35.823 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.033) 0:00:35.857 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.051) 0:00:35.908 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.036) 0:00:35.944 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:30:49 +0000 (0:00:00.037) 0:00:35.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.031) 0:00:36.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.032) 0:00:36.045 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.038) 0:00:36.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.039) 0:00:36.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101028.0461216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101028.0461216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2824, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101028.0461216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.375) 0:00:36.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.091) 0:00:36.590 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.037) 0:00:36.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.034) 0:00:36.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.029) 0:00:36.691 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.033) 0:00:36.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.029) 0:00:36.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.032) 0:00:36.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.030) 0:00:36.818 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.037) 0:00:36.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.030) 0:00:36.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.027) 0:00:36.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.027) 0:00:36.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:30:50 +0000 (0:00:00.033) 0:00:36.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.031) 0:00:37.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.038) 0:00:37.045 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.034) 0:00:37.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.030) 0:00:37.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.031) 0:00:37.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.033) 0:00:37.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.031) 0:00:37.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.031) 0:00:37.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.029) 0:00:37.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.030) 0:00:37.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.029) 0:00:37.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.031) 0:00:37.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.030) 0:00:37.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.030) 0:00:37.422 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:30:51 +0000 (0:00:00.368) 0:00:37.790 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.358) 0:00:38.149 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.037) 0:00:38.186 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.031) 0:00:38.218 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.028) 0:00:38.247 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.027) 0:00:38.275 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.029) 0:00:38.304 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.027) 0:00:38.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.027) 0:00:38.360 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.031) 0:00:38.391 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.031) 0:00:38.423 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.038) 0:00:38.461 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038317", "end": "2022-06-01 12:30:52.325606", "rc": 0, "start": "2022-06-01 12:30:52.287289" }

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.428) 0:00:38.890 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.042) 0:00:38.932 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:30:52 +0000 (0:00:00.041) 0:00:38.974 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.032) 0:00:39.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.032) 0:00:39.039 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.031) 0:00:39.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.032) 0:00:39.103 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.033) 0:00:39.137 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.028) 0:00:39.169 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.029) 0:00:39.198 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:55
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.029) 0:00:39.227 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.067) 0:00:39.294 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.043) 0:00:39.337 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.504) 0:00:39.842 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.072) 0:00:39.914 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.032) 0:00:39.947 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:30:53 +0000 (0:00:00.031) 0:00:39.978 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.062) 0:00:40.040 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.027) 0:00:40.067 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.028) 0:00:40.096 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.036) 0:00:40.133 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.035) 0:00:40.168 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.029) 0:00:40.198 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.029) 0:00:40.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.034) 0:00:40.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.030) 0:00:40.292 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.046) 0:00:40.339 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:30:54 +0000 (0:00:00.028) 0:00:40.368 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:30:55 +0000 (0:00:01.265) 0:00:41.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.031) 0:00:41.665 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.028) 0:00:41.693 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.040) 0:00:41.733 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.037) 0:00:41.770 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.033) 0:00:41.804 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:30:55 +0000 (0:00:00.028) 0:00:41.833 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:30:56 +0000 (0:00:00.657) 0:00:42.491 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:30:56 +0000 (0:00:00.396) 0:00:42.887 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:30:57 +0000 (0:00:00.628) 0:00:43.515 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:30:57 +0000 (0:00:00.366) 0:00:43.882 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:30:57 +0000 (0:00:00.033) 0:00:43.916 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:67
Wednesday 01 June 2022 16:30:58 +0000 (0:00:00.822) 0:00:44.739 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:30:58 +0000 (0:00:00.061) 0:00:44.801 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:30:58 +0000 (0:00:00.039) 0:00:44.841 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:30:58 +0000 (0:00:00.030) 0:00:44.871 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "EG9AE2-cb8E-MgHQ-fMdo-yjiI-bpv5-hV43M9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:30:59 +0000 (0:00:00.362) 0:00:45.233 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002387", "end": "2022-06-01 12:30:59.037718", "rc": 0, "start": "2022-06-01 12:30:59.035331" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:30:59 +0000 (0:00:00.363) 0:00:45.597 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002491", "end": "2022-06-01 12:30:59.407610", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:30:59.405119" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:30:59 +0000 (0:00:00.367) 0:00:45.964 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.068) 0:00:46.032 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.032) 0:00:46.065 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.063) 0:00:46.128 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.039) 0:00:46.168 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.364) 0:00:46.533 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.043) 0:00:46.576 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.039) 0:00:46.615 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.036) 0:00:46.651 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.036) 0:00:46.688 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.030) 0:00:46.718 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.040) 0:00:46.759 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.057) 0:00:46.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.030) 0:00:46.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.029) 0:00:46.877 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.030) 0:00:46.907 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.029) 0:00:46.937 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:31:00 +0000 (0:00:00.032) 0:00:46.970 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.077) 0:00:47.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.031) 0:00:47.079 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.031) 0:00:47.111 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.058) 0:00:47.170 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.063) 0:00:47.234 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.031) 0:00:47.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.030) 0:00:47.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.030) 0:00:47.326 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.062) 0:00:47.388 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.037) 0:00:47.426 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.035) 0:00:47.461 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.057) 0:00:47.519 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.037) 0:00:47.556 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.035) 0:00:47.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.029) 0:00:47.622 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.028) 0:00:47.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.029) 0:00:47.680 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.032) 0:00:47.713 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.033) 0:00:47.747 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.062) 0:00:47.809 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.063) 0:00:47.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.029) 0:00:47.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.030) 0:00:47.933 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.031) 0:00:47.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:31:01 +0000 (0:00:00.030) 0:00:47.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.029) 0:00:48.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.032) 0:00:48.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.030) 0:00:48.088 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.031) 0:00:48.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.031) 0:00:48.151 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.059) 0:00:48.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.034) 0:00:48.245 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.122) 0:00:48.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.033) 0:00:48.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "352a8a93-94c0-441f-839f-33991a5ab6c5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.040) 0:00:48.442 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.037) 0:00:48.479 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:31:02 +0000 
(0:00:00.038) 0:00:48.518 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.043) 0:00:48.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.030) 0:00:48.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.029) 0:00:48.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.030) 0:00:48.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.031) 0:00:48.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.047) 0:00:48.731 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.033) 0:00:48.764 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.035) 0:00:48.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.029) 0:00:48.830 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.031) 0:00:48.861 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.042) 0:00:48.904 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:31:02 +0000 (0:00:00.039) 0:00:48.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101028.0461216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101028.0461216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2824, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101028.0461216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.381) 0:00:49.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.038) 0:00:49.363 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.036) 0:00:49.399 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.032) 0:00:49.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.029) 0:00:49.461 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.035) 0:00:49.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.030) 0:00:49.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.032) 0:00:49.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.029) 0:00:49.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.036) 0:00:49.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.028) 0:00:49.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.028) 0:00:49.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.029) 0:00:49.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.031) 0:00:49.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.030) 0:00:49.774 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.036) 0:00:49.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.034) 0:00:49.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.029) 0:00:49.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.029) 0:00:49.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.032) 0:00:49.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:31:03 +0000 (0:00:00.031) 0:00:49.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.032) 0:00:50.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.030) 0:00:50.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.030) 0:00:50.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.029) 0:00:50.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.032) 0:00:50.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.031) 0:00:50.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.030) 0:00:50.185 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.365) 0:00:50.550 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.365) 0:00:50.916 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.038) 0:00:50.954 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:31:04 +0000 (0:00:00.034) 0:00:50.989 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.031) 0:00:51.020 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.030) 0:00:51.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.032) 0:00:51.084 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.033) 0:00:51.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.033) 0:00:51.150 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.034) 0:00:51.185 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.034) 0:00:51.219 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.043) 0:00:51.262 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030061", "end": "2022-06-01 12:31:05.104605", "rc": 0, "start": "2022-06-01 12:31:05.074544" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.395) 0:00:51.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.040) 0:00:51.698 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.039) 0:00:51.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.031) 0:00:51.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.031) 0:00:51.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.032) 0:00:51.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.032) 0:00:51.865 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.034) 0:00:51.900 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:31:05 +0000 (0:00:00.068) 0:00:51.969 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.028) 0:00:51.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:69 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.029) 0:00:52.027 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.079) 0:00:52.107 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.043) 0:00:52.150 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.516) 0:00:52.667 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list 
of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.073) 0:00:52.741 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.030) 0:00:52.771 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.031) 0:00:52.802 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.075) 0:00:52.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.027) 0:00:52.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.032) 0:00:52.938 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test2", "name": "test1", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:31:06 +0000 (0:00:00.039) 0:00:52.977 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.033) 0:00:53.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.031) 0:00:53.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.034) 0:00:53.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.032) 0:00:53.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.031) 0:00:53.141 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.046) 0:00:53.187 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:31:07 +0000 (0:00:00.029) 0:00:53.217 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:31:09 +0000 (0:00:01.875) 0:00:55.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.032) 0:00:55.125 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.027) 0:00:55.153 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": 
true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.041) 0:00:55.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.039) 0:00:55.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.034) 0:00:55.268 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": 
"mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:31:09 +0000 (0:00:00.377) 0:00:55.646 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:31:10 +0000 (0:00:00.663) 0:00:56.309 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:31:10 +0000 (0:00:00.032) 0:00:56.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:31:10 +0000 (0:00:00.639) 0:00:56.981 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:31:11 +0000 (0:00:00.375) 0:00:57.357 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:31:11 +0000 
(0:00:00.029) 0:00:57.386 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_change_mount.yml:82 Wednesday 01 June 2022 16:31:12 +0000 (0:00:00.865) 0:00:58.252 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:31:12 +0000 (0:00:00.061) 0:00:58.313 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:31:12 +0000 (0:00:00.040) 0:00:58.353 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:31:12 +0000 (0:00:00.028) 0:00:58.382 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:31:12 +0000 (0:00:00.378) 0:00:58.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002530", "end": "2022-06-01 12:31:12.577664", "rc": 0, "start": "2022-06-01 12:31:12.575134" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.369) 0:00:59.130 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002467", "end": "2022-06-01 12:31:12.940411", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:31:12.937944" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.366) 0:00:59.497 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.068) 0:00:59.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.032) 0:00:59.598 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.063) 0:00:59.661 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.090) 0:00:59.751 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.029) 0:00:59.781 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.029) 0:00:59.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.038) 0:00:59.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.034) 0:00:59.884 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.034) 0:00:59.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.032) 0:00:59.951 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:31:13 +0000 (0:00:00.028) 0:00:59.980 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.055) 0:01:00.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 
0:01:00.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.032) 0:01:00.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.159 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.028) 0:01:00.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.035) 0:01:00.281 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.057) 0:01:00.339 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.060) 0:01:00.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.028) 0:01:00.488 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.059) 0:01:00.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.033) 0:01:00.582 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.028) 0:01:00.610 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.026) 0:01:00.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.667 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.060) 0:01:00.727 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.063) 0:01:00.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.029) 0:01:00.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.030) 0:01:00.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:31:14 +0000 (0:00:00.028) 0:01:00.971 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.030) 0:01:01.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.033) 0:01:01.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.034) 0:01:01.069 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.062) 0:01:01.132 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.035) 0:01:01.167 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.133) 0:01:01.301 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.036) 0:01:01.338 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.041) 0:01:01.379 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.030) 0:01:01.410 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.037) 0:01:01.447 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.029) 0:01:01.477 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.029) 0:01:01.506 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.028) 0:01:01.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.029) 0:01:01.564 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.032) 0:01:01.597 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.096) 0:01:01.693 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.027) 0:01:01.720 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.036) 0:01:01.757 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.031) 0:01:01.788 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.033) 0:01:01.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.031) 0:01:01.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:31:15 +0000 (0:00:00.029) 0:01:01.882 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.363) 0:01:02.246 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.037) 0:01:02.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.025) 0:01:02.309 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.033) 0:01:02.343 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.030) 0:01:02.374 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.026) 0:01:02.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.032) 0:01:02.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.030) 0:01:02.463 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.029) 0:01:02.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.025) 0:01:02.518 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.029) 0:01:02.548 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.030) 0:01:02.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.033) 0:01:02.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.038) 0:01:02.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.031) 0:01:02.682 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.039) 0:01:02.721 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.036) 0:01:02.758 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.029) 0:01:02.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.033) 0:01:02.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.031) 0:01:02.853 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.032) 0:01:02.885 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.030) 0:01:02.916 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.029) 0:01:02.946 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:31:16 +0000 (0:00:00.030) 0:01:02.977 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.010 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.041 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.072 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.102 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.029) 0:01:03.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.195 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.034) 0:01:03.230 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.036) 0:01:03.266 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.038) 0:01:03.304 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.031) 0:01:03.336 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.034) 0:01:03.402 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.036) 0:01:03.438 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.471 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.029) 0:01:03.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.029) 0:01:03.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.032) 0:01:03.627 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.660 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.031) 0:01:03.692 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.030) 0:01:03.722 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.029) 0:01:03.752 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.033) 0:01:03.785 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.029) 0:01:03.815 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=380 changed=6 unreachable=0 failed=0 skipped=296 rescued=0 ignored=0

Wednesday 01 June 2022 16:31:17 +0000 (0:00:00.017) 0:01:03.832 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.31s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : make sure blivet is available -------------- 1.11s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/tmp7247_7fr/tests/tests_change_mount_scsi_generated.yml:3 ----------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/tests_change_mount.yml:2 -------------------------------
linux-system-roles.storage : make sure required packages are installed --- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
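The "Skipping callback" lines are expected noise rather than errors: Ansible allows exactly one stdout callback per run, so every other plugin that could claim stdout is skipped. Which plugin wins can be pinned in `ansible.cfg`; `debug` below is an arbitrary example, not what this CI run used:

```ini
; ansible.cfg -- only one stdout callback is ever active; all the others
; are skipped, which is what the "Skipping callback ..." lines report.
[defaults]
stdout_callback = debug
```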
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:31:18 +0000 (0:00:00.022) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:31:19 +0000 (0:00:01.266) 0:00:01.289 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_disk_then_remove.yml ************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:2
Wednesday 01 June 2022 16:31:19 +0000 (0:00:00.013) 0:00:01.302 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:9
Wednesday 01 June 2022 16:31:20 +0000 (0:00:01.085) 0:00:02.388 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.037) 0:00:02.425 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.155) 0:00:02.581 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.514) 0:00:03.095 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.077) 0:00:03.173 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.023) 0:00:03.197 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:31:21 +0000 (0:00:00.023) 0:00:03.220 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:31:22 +0000 (0:00:00.192) 0:00:03.413 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:31:22 +0000 (0:00:00.020) 0:00:03.433 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:31:23 +0000 (0:00:01.262) 0:00:04.696 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:31:23 +0000 (0:00:00.051) 0:00:04.748 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:31:23 +0000 (0:00:00.044) 0:00:04.792 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.684) 0:00:05.477 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.080) 0:00:05.558 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.020) 0:00:05.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.022) 0:00:05.600 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.020) 0:00:05.621 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:31:24 +0000 (0:00:00.778) 0:00:06.399 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd",
"state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name":
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name":
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
"insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service",
"source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
"oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
"pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
"raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
"rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
"rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status":
"disabled" },
"rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
"serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled"
},
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state":
"unknown", "status": "static" },
"systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-flush.service": {
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
"systemd-repart.service": {
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
"teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
"user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
} } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:31:26 +0000 (0:00:01.775) 0:00:08.175 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:31:26 +0000 (0:00:00.041) 0:00:08.216 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:31:26 +0000 (0:00:00.026) 0:00:08.243 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.495) 0:00:08.738 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.057) 0:00:08.796 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.026) 0:00:08.823 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.030) 0:00:08.854 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.030) 0:00:08.885 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.029) 0:00:08.915 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.027) 0:00:08.942 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.029) 0:00:08.972 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.026) 0:00:08.998 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:31:27 +0000 (0:00:00.029) 0:00:09.028 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:31:28 +0000 (0:00:00.464) 0:00:09.493 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:31:28 +0000 (0:00:00.027) 0:00:09.521 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:12
Wednesday 01 June 2022 16:31:28 +0000 (0:00:00.827) 0:00:10.348 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:19
Wednesday 01 June 2022 16:31:28 +0000 (0:00:00.029) 0:00:10.378 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.043) 0:00:10.421 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "disks": [
        "sda"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.497) 0:00:10.919 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "unused_disks": [
            "sda"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.035) 0:00:10.954 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task
path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.030) 0:00:10.984 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk device mounted on "/opt/test1"; specify disks as non-list] *** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:23 Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.033) 0:00:11.017 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.071) 0:00:11.089 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:31:29 +0000 (0:00:00.041) 0:00:11.130 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.524) 0:00:11.655 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": 
"item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.067) 0:00:11.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.029) 0:00:11.752 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.029) 0:00:11.782 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.058) 0:00:11.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.024) 0:00:11.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.028) 0:00:11.893 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.030) 0:00:11.924 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": "sda", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.035) 0:00:11.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.062) 0:00:12.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.029) 0:00:12.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 
16:31:30 +0000 (0:00:00.029) 0:00:12.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.028) 0:00:12.111 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.040) 0:00:12.152 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:31:30 +0000 (0:00:00.026) 0:00:12.178 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:31:32 +0000 (0:00:01.330) 0:00:13.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.030) 0:00:13.539 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.025) 0:00:13.565 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": 
"/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.035) 0:00:13.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.031) 0:00:13.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.034) 0:00:13.666 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:31:32 +0000 (0:00:00.030) 0:00:13.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:31:33 +0000 (0:00:00.913) 0:00:14.610 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } TASK [linux-system-roles.storage : 
tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:31:33 +0000 (0:00:00.548) 0:00:15.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:31:34 +0000 (0:00:00.648) 0:00:15.807 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:31:34 +0000 (0:00:00.355) 0:00:16.163 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:31:34 +0000 (0:00:00.029) 0:00:16.192 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:34 Wednesday 01 June 2022 16:31:35 +0000 (0:00:00.823) 0:00:17.015 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:31:35 +0000 (0:00:00.051) 0:00:17.067 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:31:35 +0000 (0:00:00.029) 0:00:17.097 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { 
"_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:31:35 +0000 (0:00:00.036) 0:00:17.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:31:36 +0000 (0:00:00.492) 0:00:17.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003156", "end": "2022-06-01 12:31:36.177612", "rc": 0, "start": "2022-06-01 12:31:36.174456" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:31:36 +0000 (0:00:00.525) 0:00:18.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003229", "end": "2022-06-01 12:31:36.566778", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:31:36.563549" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.387) 0:00:18.539 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.028) 0:00:18.567 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.030) 0:00:18.598 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.058) 0:00:18.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.035) 0:00:18.692 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.152) 0:00:18.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.034) 0:00:18.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.041) 0:00:18.920 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.035) 0:00:18.956 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.034) 0:00:18.990 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.035) 0:00:19.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.033) 0:00:19.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.030) 0:00:19.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.028) 0:00:19.118 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } 
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.028) 0:00:19.147 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.045) 0:00:19.192 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.032) 0:00:19.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.038) 0:00:19.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.035) 0:00:19.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.031) 0:00:19.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.036) 0:00:19.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:31:37 +0000 (0:00:00.035) 0:00:19.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101091.4811215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101091.4811215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101091.4811215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 
16:31:38 +0000 (0:00:00.378) 0:00:19.780 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.036) 0:00:19.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.035) 0:00:19.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.033) 0:00:19.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.028) 0:00:19.915 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.031) 0:00:19.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.025) 0:00:19.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.026) 0:00:19.999 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.029) 0:00:20.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.036) 0:00:20.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.029) 0:00:20.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.027) 0:00:20.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.028) 0:00:20.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.028) 0:00:20.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.028) 0:00:20.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.035) 0:00:20.243 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.033) 0:00:20.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.029) 0:00:20.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.028) 0:00:20.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.026) 0:00:20.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:31:38 +0000 (0:00:00.027) 0:00:20.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:20.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:20.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.027) 0:00:20.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.660 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is 
undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.027) 0:00:20.688 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.026) 0:00:20.715 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:20.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:20.773 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:20.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:20.830 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.032) 
0:00:20.863 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.030) 0:00:20.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.060) 0:00:20.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:20.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:21.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.030) 0:00:21.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.030) 0:00:21.075 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.028) 0:00:21.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.031) 0:00:21.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:21.164 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:21.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation minus fs_type to verify idempotence] ****** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:36 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.029) 0:00:21.222 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.075) 0:00:21.298 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:31:39 +0000 (0:00:00.044) 0:00:21.343 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.669) 0:00:22.013 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.068) 0:00:22.081 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.030) 0:00:22.111 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.028) 0:00:22.139 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.057) 0:00:22.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.025) 0:00:22.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.030) 0:00:22.253 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.032) 0:00:22.285 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ 
"sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.034) 0:00:22.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.029) 0:00:22.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.030) 0:00:22.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:31:40 +0000 (0:00:00.028) 0:00:22.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:31:41 +0000 (0:00:00.028) 0:00:22.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 
Wednesday 01 June 2022 16:31:41 +0000 (0:00:00.043) 0:00:22.480 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:31:41 +0000 (0:00:00.026) 0:00:22.507 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:31:42 +0000 (0:00:01.026) 0:00:23.533 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.030) 0:00:23.564 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.027) 0:00:23.591 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", 
"type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.036) 0:00:23.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.033) 0:00:23.660 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.034) 0:00:23.694 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.029) 0:00:23.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:31:42 +0000 (0:00:00.688) 0:00:24.412 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:31:43 +0000 (0:00:00.386) 0:00:24.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:31:44 +0000 (0:00:00.644) 0:00:25.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:31:44 +0000 
(0:00:00.358) 0:00:25.801 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:31:44 +0000 (0:00:00.029) 0:00:25.831 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert file system is preserved on existing partition volume] ************ task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:46 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.818) 0:00:26.649 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:51 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.039) 0:00:26.689 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.054) 0:00:26.743 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.030) 0:00:26.773 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": 
"ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.037) 0:00:26.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:31:45 +0000 (0:00:00.357) 0:00:27.168 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002396", "end": "2022-06-01 12:31:45.562276", "rc": 0, "start": "2022-06-01 12:31:45.559880" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.369) 0:00:27.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002417", "end": "2022-06-01 12:31:45.929332", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:31:45.926915" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.359) 0:00:27.897 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.029) 0:00:27.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.031) 0:00:27.958 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.068) 0:00:28.027 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.044) 0:00:28.071 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.155) 0:00:28.226 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.034) 0:00:28.261 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.038) 0:00:28.299 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.039) 0:00:28.338 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.033) 0:00:28.372 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:31:46 +0000 (0:00:00.035) 0:00:28.408 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.029) 0:00:28.437 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.030) 0:00:28.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.028) 0:00:28.496 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.029) 0:00:28.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.045) 0:00:28.571 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.033) 0:00:28.605 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.034) 0:00:28.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.029) 0:00:28.669 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.030) 0:00:28.700 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.038) 0:00:28.738 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.037) 0:00:28.775 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101091.4811215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101091.4811215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101091.4811215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.373) 0:00:29.149 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.035) 0:00:29.184 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.034) 0:00:29.219 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.032) 0:00:29.252 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.036) 0:00:29.288 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.036) 0:00:29.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.031) 0:00:29.356 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:31:47 +0000 (0:00:00.030) 0:00:29.387 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:29.415 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.036) 0:00:29.451 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:29.480 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.030) 0:00:29.511 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:29.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:29.568 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:29.597 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.035) 0:00:29.632 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.034) 0:00:29.666 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.038) 0:00:29.705 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.033) 0:00:29.739 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.030) 0:00:29.770 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.030) 0:00:29.800 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:29.830 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:29.859 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.031) 0:00:29.890 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:29.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:29.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.030) 0:00:29.980 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.031) 0:00:30.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:30.040 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.032) 0:00:30.072 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.031) 0:00:30.104 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.032) 0:00:30.136 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug]
*******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:30.164 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.030) 0:00:30.195 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:30.224 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.032) 0:00:30.256 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:30.286 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.033) 0:00:30.319 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.032) 0:00:30.352 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.029) 0:00:30.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:31:48 +0000 (0:00:00.028) 0:00:30.409 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.031) 0:00:30.441 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.067) 0:00:30.508 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.029) 0:00:30.538 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.029) 0:00:30.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.029) 0:00:30.597 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.030) 0:00:30.628 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.029) 0:00:30.657 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove the disk device created above] ************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:53
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.032) 0:00:30.690 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.069) 0:00:30.759 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:31:49
+0000 (0:00:00.046) 0:00:30.806 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.518) 0:00:31.325 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:31:49 +0000 (0:00:00.070) 0:00:31.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.029) 0:00:31.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.027) 0:00:31.453 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.060) 0:00:31.513 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.025) 0:00:31.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.029) 0:00:31.568 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.031) 0:00:31.600 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.036) 0:00:31.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.029) 0:00:31.666 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.029) 0:00:31.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.028) 0:00:31.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.028) 0:00:31.753 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.042) 0:00:31.796 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:31:50 +0000 (0:00:00.042) 0:00:31.838 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:31:51 +0000 (0:00:01.313) 0:00:33.152 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:31:51 +0000 (0:00:00.029) 0:00:33.182 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:31:51 +0000 (0:00:00.026) 0:00:33.209 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:31:51 +0000 (0:00:00.036) 0:00:33.245 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:31:51 +0000 (0:00:00.033) 0:00:33.278 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:31:51 +0000 (0:00:00.034) 0:00:33.312 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src":
"UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:31:52 +0000 (0:00:00.375) 0:00:33.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:31:52 +0000 (0:00:00.647) 0:00:34.336 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:31:52 +0000 (0:00:00.067) 0:00:34.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:31:53 +0000 (0:00:00.639) 0:00:35.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:31:53 +0000 (0:00:00.366) 0:00:35.409 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:31:54 +0000 (0:00:00.027) 0:00:35.437 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:64 Wednesday 01 June 2022 
16:31:54 +0000 (0:00:00.802) 0:00:36.239 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:31:54 +0000 (0:00:00.059) 0:00:36.299 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:31:54 +0000 (0:00:00.030) 0:00:36.329 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=261b87a1-1a05-4e3a-9527-d2e7d9bba3b1", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:31:54 +0000 (0:00:00.037)       0:00:36.367 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/sda": {"fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sdb": {"fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sdc": {"fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sr0": {"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00"},
        "/dev/vda": {"fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vda1": {"fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": ""},
        "/dev/vda2": {"fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7"},
        "/dev/vda3": {"fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7"},
        "/dev/vda4": {"fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345"},
        "/dev/vdb": {"fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdc": {"fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdd": {"fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": ""}
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:31:55 +0000 (0:00:00.376)       0:00:36.743 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002579",
"end": "2022-06-01 12:31:55.135556", "rc": 0, "start": "2022-06-01 12:31:55.132977"}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:31:55 +0000 (0:00:00.361)       0:00:37.104 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.003334", "end": "2022-06-01 12:31:55.510905", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:31:55.507571"}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.379)       0:00:37.484 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.028)       0:00:37.513 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.031)       0:00:37.544 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
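The loop-variable warning above is fixed the way the message suggests: give the looping `include_tasks` its own variable name via `loop_control`. A minimal sketch, assuming the test loops over `_storage_volumes_list` (the replacement variable name is hypothetical):

```yaml
# Hedged sketch of the fix for the loop_var collision warning; the task
# name and list variable are taken from this log, the loop_var is invented.
- name: Verify the volumes with no pool were correctly managed
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_volumes_list }}"
  loop_control:
    loop_var: storage_test_volume_item   # avoids reusing 'storage_test_volume'
```

Without this, the inner `storage_test_volume` set by the role's own loops and the outer loop variable share a name, which Ansible warns can cause silent shadowing.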
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.059) 0:00:37.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.033) 0:00:37.637 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.107) 0:00:37.744 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.033) 0:00:37.778 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.039) 0:00:37.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.028) 0:00:37.846 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.034) 0:00:37.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.031) 0:00:37.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.028) 0:00:37.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.028) 
0:00:37.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.029) 0:00:37.997 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.029) 0:00:38.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.053) 0:00:38.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.026) 0:00:38.107 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.036) 0:00:38.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.029) 0:00:38.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.029) 0:00:38.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.029) 0:00:38.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:31:56 +0000 (0:00:00.024) 0:00:38.257 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101111.1081214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101111.1081214, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, 
"isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101111.1081214, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.373) 0:00:38.631 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.036) 0:00:38.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.024) 0:00:38.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.032) 0:00:38.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.028) 0:00:38.753 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.024) 0:00:38.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.027) 0:00:38.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.028) 0:00:38.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.029) 0:00:38.863 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.023) 0:00:38.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.027) 0:00:38.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.028) 0:00:38.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.029) 0:00:38.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.028) 0:00:39.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.031) 0:00:39.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.036) 0:00:39.069 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.034) 0:00:39.104 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.029) 0:00:39.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.027) 0:00:39.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.027) 0:00:39.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.030) 0:00:39.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.029) 0:00:39.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:31:57 
+0000 (0:00:00.030) 0:00:39.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.035) 0:00:39.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.029) 0:00:39.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.030) 0:00:39.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:31:57 +0000 (0:00:00.030) 0:00:39.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.463 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.030) 0:00:39.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.029) 0:00:39.524 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.032) 0:00:39.556 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.031) 0:00:39.587 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.645 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 
2022 16:31:58 +0000 (0:00:00.028) 0:00:39.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.702 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.031) 0:00:39.734 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.033) 0:00:39.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 
01 June 2022 16:31:58 +0000 (0:00:00.028) 0:00:39.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.034) 0:00:39.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.030) 0:00:39.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.035) 0:00:39.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.029) 0:00:39.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.027) 0:00:40.009 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.031) 0:00:40.041 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=200 changed=4 unreachable=0 failed=0 skipped=177 rescued=0 ignored=0 Wednesday 01 June 2022 16:31:58 +0000 (0:00:00.014) 0:00:40.055 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.31s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.26s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:2 -------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.91s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.83s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = 
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:31:59 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:32:00 +0000 (0:00:01.247) 0:00:01.270 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_disk_then_remove_nvme_generated.yml ********************* 2 plays in /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran 
handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:32:00 +0000 (0:00:00.017) 0:00:01.287 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:32:01 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:32:02 +0000 (0:00:01.267) 0:00:01.290 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_disk_then_remove_scsi_generated.yml ********************* 2 plays in /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_scsi_generated.yml:3 Wednesday 01 June 2022 16:32:02 +0000 (0:00:00.015) 0:00:01.306 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_scsi_generated.yml:7 Wednesday 01 June 2022 16:32:03 +0000 (0:00:01.089) 0:00:02.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:2 Wednesday 01 June 2022 16:32:03 +0000 (0:00:00.025) 0:00:02.421 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:9 Wednesday 01 June 2022 16:32:04 +0000 (0:00:00.796) 0:00:03.218 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:32:04 +0000 (0:00:00.038) 0:00:03.256 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:32:04 +0000 (0:00:00.151) 0:00:03.407 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.535) 0:00:03.943 
******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.073) 0:00:04.017 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.023) 0:00:04.040 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.022) 0:00:04.062 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml 
for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.192) 0:00:04.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:32:05 +0000 (0:00:00.020) 0:00:04.275 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:32:06 +0000 (0:00:01.056) 0:00:05.332 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:32:06 +0000 (0:00:00.045) 0:00:05.378 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:32:06 +0000 (0:00:00.045) 0:00:05.424 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:32:07 +0000 (0:00:00.681) 0:00:06.105 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:32:07 +0000 (0:00:00.079) 0:00:06.184 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:32:07 +0000 (0:00:00.021) 0:00:06.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:32:07 +0000 (0:00:00.024) 0:00:06.230 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:32:07 +0000 (0:00:00.020) 0:00:06.251 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:32:08 +0000 (0:00:00.805) 0:00:07.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:32:10 +0000 (0:00:01.767) 0:00:08.824 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.044) 0:00:08.868 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.026) 0:00:08.895 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.521) 0:00:09.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.028) 0:00:09.445 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.024) 0:00:09.470 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.030) 0:00:09.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.029) 0:00:09.529 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.032) 0:00:09.562 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.027) 0:00:09.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:32:10 +0000 (0:00:00.030) 0:00:09.620 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:32:11 +0000 (0:00:00.027) 0:00:09.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:32:11 +0000 (0:00:00.027) 0:00:09.674 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:32:11 +0000 (0:00:00.448) 0:00:10.123 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:32:11 +0000 (0:00:00.027) 0:00:10.150 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:12 Wednesday 01 June 2022 16:32:12 +0000 (0:00:00.885) 0:00:11.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:19 Wednesday 01 June 2022 16:32:12 +0000 (0:00:00.029) 0:00:11.066 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:32:12 +0000 (0:00:00.043) 0:00:11.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.529) 0:00:11.639 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.036) 0:00:11.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.029) 0:00:11.705 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk device mounted on "/opt/test1"; specify disks as non-list] *** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:23 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.032) 0:00:11.738 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.052) 0:00:11.791 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.040) 0:00:11.831 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] 
**** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.495) 0:00:12.326 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.068) 0:00:12.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.030) 0:00:12.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 
Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.028) 0:00:12.454 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.098) 0:00:12.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.025) 0:00:12.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:32:13 +0000 (0:00:00.029) 0:00:12.607 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.031) 0:00:12.638 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": "sda", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.034) 0:00:12.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
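The "show storage_volumes" output above reflects the point of this test case: `disks` is passed as a bare string (`"sda"`) rather than a list. A minimal sketch of the kind of role invocation that would produce that variable — the volume values are taken from the log, but the surrounding play structure is assumed, not shown in this transcript:

```yaml
- hosts: all
  tasks:
    - name: Create a disk device mounted on "/opt/test1"; specify disks as non-list
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks: "sda"          # bare string, not ["sda"] -- the non-list form under test
            fs_type: ext4
            mount_point: /opt/test1
```

The role normalizes the string into a one-element list internally, which is why the later task output shows `"disks": [ "sda" ]`.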
TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.029) 0:00:12.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.028) 0:00:12.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.028) 0:00:12.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.029) 0:00:12.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.040) 0:00:12.829 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:32:14 +0000 (0:00:00.028) 0:00:12.858 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": 
"/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:32:15 +0000 (0:00:01.254) 0:00:14.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.028) 0:00:14.141 ******** 
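The `mounts` entry in the blivet result above corresponds to a single /etc/fstab line. Reconstructed from the `src`, `path`, `fstype`, `opts`, `dump`, and `passno` fields reported in the log (field order follows fstab(5)):

```
UUID=7d5bb690-4e09-4ee5-b431-32ba43144542 /opt/test1 ext4 defaults 0 0
```

The later "set up new/current mounts" task writes exactly this mapping via the mount module, and the "Read the /etc/fstab file" verification step at the end of the test checks for it.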
TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.028) 0:00:14.170 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.036) 
0:00:14.207 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.032) 0:00:14.240 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.033) 0:00:14.274 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:32:15 +0000 (0:00:00.031) 0:00:14.305 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:32:16 +0000 (0:00:00.901) 0:00:15.207 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=7d5bb690-4e09-4ee5-b431-32ba43144542', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:32:17 +0000 (0:00:00.540) 0:00:15.747 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:32:17 +0000 (0:00:00.636) 0:00:16.383 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:32:18 +0000 (0:00:00.371) 0:00:16.755 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:32:18 +0000 (0:00:00.028) 0:00:16.784 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:34
Wednesday 01 June 2022 16:32:18 +0000 (0:00:00.800) 0:00:17.584 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:32:19 +0000 (0:00:00.050) 0:00:17.635 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:32:19 +0000 (0:00:00.029) 0:00:17.664 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.]
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:32:19 +0000 (0:00:00.036) 0:00:17.700 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:32:19 +0000 (0:00:00.497) 0:00:18.198 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003315", "end": "2022-06-01 12:32:19.531336", "rc": 0, "start": "2022-06-01 12:32:19.528021" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=7d5bb690-4e09-4ee5-b431-32ba43144542 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.518) 0:00:18.717 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002874", "end": "2022-06-01 12:32:19.898158", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:32:19.895284" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.362) 0:00:19.079 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.028) 0:00:19.107 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.029) 0:00:19.136 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.058) 0:00:19.195 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.119) 0:00:19.230 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.119) 0:00:19.350 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.037) 0:00:19.387 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime",
"size_available": 9910349824, "size_total": 10464022528, "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.040) 0:00:19.428 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.036) 0:00:19.465 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.033) 0:00:19.498 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.034) 0:00:19.533 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.029) 0:00:19.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.030) 0:00:19.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:32:20 +0000 (0:00:00.027) 0:00:19.621 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.030) 0:00:19.652 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.043) 0:00:19.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.033) 0:00:19.729 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.034) 0:00:19.763 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.029) 0:00:19.793 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.034) 0:00:19.827 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.038) 0:00:19.866 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.035)
0:00:19.902 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101134.8791215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101134.8791215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101134.8791215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.398) 0:00:20.300 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.037) 0:00:20.338 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.035) 0:00:20.373 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.035) 0:00:20.408 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.030) 0:00:20.439 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.034) 0:00:20.474 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.029) 0:00:20.503 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.029) 0:00:20.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.028) 0:00:20.561 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:32:21 +0000 (0:00:00.038) 0:00:20.600 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:20.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:20.660 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:20.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.028) 0:00:20.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.028) 0:00:20.746 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.036) 0:00:20.782 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.032) 0:00:20.814 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.028) 0:00:20.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.028) 0:00:20.871 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.027) 0:00:20.899 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:20.928 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.066) 0:00:20.994 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.025 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:21.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.028) 0:00:21.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.143 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.037) 0:00:21.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.032) 0:00:21.213 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.244 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.031) 0:00:21.276 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.031) 0:00:21.308 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:21.337 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.031) 0:00:21.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.399 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.029) 0:00:21.429 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.030) 0:00:21.459 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.036) 0:00:21.495 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.035) 0:00:21.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.040) 0:00:21.571 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:32:22 +0000 (0:00:00.031) 0:00:21.603 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.029) 0:00:21.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.030) 0:00:21.663 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.030) 0:00:21.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.030) 0:00:21.724 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.031) 0:00:21.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.030) 0:00:21.786 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.029) 0:00:21.816 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation minus fs_type to verify idempotence] ******
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:36
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.029) 0:00:21.845 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.062) 0:00:21.908 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.042) 0:00:21.950 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.500) 0:00:22.450 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": {
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.077) 0:00:22.528 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.032) 0:00:22.561 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:32:23 +0000 (0:00:00.030) 0:00:22.591 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.062) 0:00:22.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.029) 0:00:22.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.028) 0:00:22.712 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.033) 0:00:22.745 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.037) 0:00:22.782 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.030) 0:00:22.812 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.029) 0:00:22.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.032) 0:00:22.874 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.029) 0:00:22.904 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.045) 0:00:22.949 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:32:24 +0000 (0:00:00.028) 0:00:22.977 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode":
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:32:25 +0000 (0:00:01.040) 0:00:24.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.030) 0:00:24.048 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.029) 0:00:24.078 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" } 
], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.036) 0:00:24.114 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.033) 0:00:24.147 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ 
"sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.036) 0:00:24.183 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:32:25 +0000 (0:00:00.037) 0:00:24.221 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:32:26 +0000 (0:00:00.644) 0:00:24.865 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=7d5bb690-4e09-4ee5-b431-32ba43144542', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "mounted" }, "name": "/opt/test1", "opts": 
"defaults", "passno": "0", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:32:26 +0000 (0:00:00.379) 0:00:25.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:32:27 +0000 (0:00:00.652) 0:00:25.898 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:32:27 +0000 (0:00:00.359) 0:00:26.257 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:32:27 +0000 (0:00:00.030) 0:00:26.288 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert file system is preserved on existing partition volume] ************ task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:46 Wednesday 01 June 2022 16:32:28 +0000 (0:00:00.855) 0:00:27.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:51 Wednesday 01 June 2022 16:32:28 +0000 (0:00:00.035) 0:00:27.178 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:32:28 +0000 (0:00:00.055) 0:00:27.234 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:32:28 +0000 (0:00:00.030) 0:00:27.264 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:32:28 +0000 (0:00:00.037) 0:00:27.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.385) 0:00:27.687 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.002442", "end": "2022-06-01 12:32:28.865961", "rc": 0, "start": "2022-06-01 12:32:28.863519" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=7d5bb690-4e09-4ee5-b431-32ba43144542 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.359) 0:00:28.046 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002583", "end": "2022-06-01 12:32:29.238033", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:32:29.235450" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.374) 0:00:28.421 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.027) 0:00:28.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.030) 0:00:28.478 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.060) 0:00:28.539 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:32:29 +0000 (0:00:00.035) 0:00:28.575 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.148) 0:00:28.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.036) 0:00:28.759 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "7d5bb690-4e09-4ee5-b431-32ba43144542" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.041) 0:00:28.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.038) 0:00:28.840 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.034) 0:00:28.874 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 16:32:30 +0000 (0:00:00.039) 0:00:28.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.029) 0:00:28.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.028) 0:00:28.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.029) 0:00:29.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.031) 0:00:29.032 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.044) 0:00:29.076 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.032) 0:00:29.109 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.035) 0:00:29.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.030) 0:00:29.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.032) 0:00:29.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:32:30 
+0000 (0:00:00.038) 0:00:29.246 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:32:30 +0000 (0:00:00.039) 0:00:29.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101134.8791215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101134.8791215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101134.8791215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.379) 0:00:29.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.041) 0:00:29.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.036) 0:00:29.743 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:29.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.028) 0:00:29.803 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.035) 0:00:29.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:29.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:29.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:29.929 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got 
info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.035) 0:00:29.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:29.994 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:30.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.028) 0:00:30.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.028) 0:00:30.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.113 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.036) 0:00:30.149 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.034) 0:00:30.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:30.214 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:30.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:30.278 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:30.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:30.340 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.032) 0:00:30.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:30.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason":
"Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.029) 0:00:30.552 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.031) 0:00:30.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:32:31 +0000 (0:00:00.030) 0:00:30.614 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.032) 0:00:30.647 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:30.676 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:30.707 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:30.736 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.031) 0:00:30.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:30.797 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.032) 0:00:30.830 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.031) 0:00:30.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.028) 0:00:30.890 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.027) 0:00:30.917 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.031) 0:00:30.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.066) 0:00:31.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:31.044 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.029) 0:00:31.074 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.028)
0:00:31.102 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.028) 0:00:31.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.035) 0:00:31.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove the disk device created above] ************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:53
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.033) 0:00:31.201 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.070) 0:00:31.271 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:32:32 +0000 (0:00:00.049) 0:00:31.320 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.532) 0:00:31.853 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.069) 0:00:31.923 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.030) 0:00:31.954 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.031) 0:00:31.985 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.060) 0:00:32.045 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.025) 0:00:32.070 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.030) 0:00:32.100 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.033) 0:00:32.133 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.034) 0:00:32.168 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.029)
0:00:32.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.029) 0:00:32.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.029) 0:00:32.256 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.027) 0:00:32.283 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.042) 0:00:32.326 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:32:33 +0000 (0:00:00.025) 0:00:32.351 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:32:35 +0000 (0:00:01.302) 0:00:33.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.029) 0:00:33.684 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.028) 0:00:33.712 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.038) 0:00:33.751 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.034) 0:00:33.785 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda",
"_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.036) 0:00:33.822 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=7d5bb690-4e09-4ee5-b431-32ba43144542', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:32:35 +0000 (0:00:00.446) 0:00:34.268 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:32:36 +0000 (0:00:00.671) 0:00:34.939 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:32:36 +0000 (0:00:00.030) 0:00:34.970 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:32:37 +0000 (0:00:00.678) 0:00:35.649 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:32:37 +0000 (0:00:00.356) 0:00:36.005 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:32:37 +0000 (0:00:00.029) 0:00:36.035 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:64
Wednesday 01 June 2022 16:32:38 +0000 (0:00:00.842) 0:00:36.878 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:32:38 +0000 (0:00:00.060) 0:00:36.938 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:32:38 +0000 (0:00:00.030) 0:00:36.969 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=7d5bb690-4e09-4ee5-b431-32ba43144542", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.]
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:32:38 +0000 (0:00:00.036) 0:00:37.006 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:32:38 +0000 (0:00:00.375) 0:00:37.382 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002646", "end": "2022-06-01 12:32:38.562725", "rc": 0, "start": "2022-06-01 12:32:38.560079" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.366) 0:00:37.748 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002531", "end": "2022-06-01 12:32:38.924301", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:32:38.921770" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.356) 0:00:38.105 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.029) 0:00:38.134 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.030) 0:00:38.164 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
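The [WARNING] above is Ansible's standard advice when a nested `include_tasks` loop reuses a loop variable that is already defined in an enclosing scope. As a minimal, hypothetical sketch (not taken from this test suite; the variable name below is illustrative only), renaming the loop variable via `loop_control`/`loop_var` avoids the collision:

```yaml
# Hypothetical sketch: give this task's loop its own variable name so it
# does not shadow an outer loop's 'storage_test_volume'.
- name: Verify the volumes with no pool were correctly managed
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_volumes_list }}"
  loop_control:
    loop_var: storage_test_volume_outer  # hypothetical name; default is 'item'
```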
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.059) 0:00:38.224 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.036) 0:00:38.260 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.110) 0:00:38.371 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.072) 0:00:38.443 ********
ok: [/cache/rhel-x.qcow2] => {
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.039) 0:00:38.483 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.029) 0:00:38.512 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.034) 0:00:38.547 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.029) 0:00:38.576 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:32:39 +0000 (0:00:00.030) 0:00:38.607 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.031) 0:00:38.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.030) 0:00:38.669 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.029) 0:00:38.698 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.044) 0:00:38.743 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.024) 0:00:38.767 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.034) 0:00:38.802 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.030) 0:00:38.832 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.029) 0:00:38.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.029) 0:00:38.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.025) 0:00:38.917 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101154.4011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101154.4011216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312,
"isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101154.4011216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.362) 0:00:39.279 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.037) 0:00:39.317 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.024) 0:00:39.342 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.031) 0:00:39.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.030) 0:00:39.403 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.024) 0:00:39.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.028) 0:00:39.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.028) 0:00:39.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.027) 0:00:39.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.024) 0:00:39.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.031) 0:00:39.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:32:40 +0000 (0:00:00.029) 0:00:39.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.035) 0:00:39.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.035) 0:00:39.753 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.782 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.029) 0:00:39.840 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:39.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:39.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:32:41 
+0000 (0:00:00.031) 0:00:39.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.031) 0:00:39.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:39.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.029) 0:00:40.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.032) 0:00:40.115 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.029) 0:00:40.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.175 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.032) 0:00:40.208 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.032) 0:00:40.241 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.034) 0:00:40.306 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 
2022 16:32:41 +0000 (0:00:00.032) 0:00:40.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.031) 0:00:40.370 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.034) 0:00:40.404 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.031) 0:00:40.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.028) 0:00:40.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.031) 0:00:40.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 
01 June 2022 16:32:41 +0000 (0:00:00.029) 0:00:40.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.029) 0:00:40.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:32:41 +0000 (0:00:00.030) 0:00:40.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:32:42 +0000 (0:00:00.029) 0:00:40.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:32:42 +0000 (0:00:00.062) 0:00:40.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:32:42 +0000 (0:00:00.029) 0:00:40.738 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=202 changed=4 unreachable=0 failed=0 skipped=177 rescued=0 ignored=0 Wednesday 01 June 2022 16:32:42 +0000 (0:00:00.015) 0:00:40.753 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.30s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.25s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_scsi_generated.yml:3 ----- linux-system-roles.storage : make sure blivet is available -------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.89s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.80s /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml:2 -------------------- linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = 
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:32:42 +0000 (0:00:00.035) 0:00:00.035 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:32:44 +0000 (0:00:01.262) 0:00:01.298 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.26s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lv_size_equal_to_vg.yml ********************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:2 
Wednesday 01 June 2022 16:32:44 +0000 (0:00:00.011) 0:00:01.309 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:16 Wednesday 01 June 2022 16:32:45 +0000 (0:00:01.073) 0:00:02.383 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:32:45 +0000 (0:00:00.046) 0:00:02.429 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:32:45 +0000 (0:00:00.152) 0:00:02.582 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:32:45 +0000 (0:00:00.525) 0:00:03.107 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:32:46 +0000 (0:00:00.075) 0:00:03.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:32:46 +0000 (0:00:00.023) 0:00:03.206 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:32:46 +0000 (0:00:00.021) 0:00:03.228 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:32:46 +0000 (0:00:00.187) 0:00:03.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:32:46 +0000 (0:00:00.019) 0:00:03.435 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:32:47 +0000 (0:00:01.073) 0:00:04.508 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:32:47 +0000 (0:00:00.045) 0:00:04.554 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:32:47 +0000 (0:00:00.044) 0:00:04.598 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:32:48 +0000 (0:00:00.711) 0:00:05.310 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:32:48 +0000 (0:00:00.079) 0:00:05.390 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:32:48 +0000 (0:00:00.020) 0:00:05.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:32:48 +0000 (0:00:00.021) 0:00:05.432 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:32:48 +0000 (0:00:00.019) 0:00:05.451 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:32:49 +0000 (0:00:00.790) 0:00:06.242 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd",
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:32:50 +0000 (0:00:01.803) 0:00:08.045 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:32:50 +0000
(0:00:00.043) 0:00:08.088 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:32:50 +0000 (0:00:00.027) 0:00:08.116 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.513) 0:00:08.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.030) 0:00:08.660 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.057) 0:00:08.717 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.032) 0:00:08.750 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.031) 0:00:08.782 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.033) 0:00:08.815 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.027) 0:00:08.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.029) 0:00:08.873 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.030) 0:00:08.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:32:51 +0000 (0:00:00.029) 0:00:08.933 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:32:52 +0000 (0:00:00.453) 0:00:09.386 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:32:52 +0000 (0:00:00.028) 0:00:09.415 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:19
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.815) 0:00:10.230 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:26
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.031) 0:00:10.261 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.045) 0:00:10.306 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.511) 0:00:10.818 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.035) 0:00:10.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.030) 0:00:10.883 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create one lv which size is equal to vg size] ****************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:31
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.030) 0:00:10.913 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.051) 0:00:10.965 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:32:53 +0000 (0:00:00.039) 0:00:11.004 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.612) 0:00:11.617 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.068) 0:00:11.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.029) 0:00:11.716 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.027) 0:00:11.743 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.060) 0:00:11.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.026) 0:00:11.830 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.029) 0:00:11.860 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "10g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.035) 0:00:11.896 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.064) 0:00:11.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.029) 0:00:11.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.029) 0:00:12.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.028) 0:00:12.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.028) 0:00:12.076 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:32:54 +0000 (0:00:00.040) 0:00:12.116 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:32:55 +0000 (0:00:00.028) 0:00:12.145 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:32:56 +0000 (0:00:01.718) 0:00:13.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.029) 0:00:13.893 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.027) 0:00:13.921 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.040) 0:00:13.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the 
list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.037) 0:00:13.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.033) 0:00:14.032 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:32:56 +0000 (0:00:00.027) 0:00:14.060 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:32:57 +0000 (0:00:00.953) 0:00:15.014 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:32:58 +0000 (0:00:00.578) 0:00:15.593 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:32:59 +0000 (0:00:00.644) 0:00:16.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:32:59 +0000 (0:00:00.366) 0:00:16.604 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:32:59 +0000 (0:00:00.031) 0:00:16.635 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:43 Wednesday 01 June 2022 16:33:00 +0000 (0:00:00.846) 0:00:17.482 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:33:00 +0000 (0:00:00.052) 0:00:17.535 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", 
"_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:33:00 +0000 (0:00:00.041) 0:00:17.576 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:33:00 +0000 (0:00:00.072) 0:00:17.649 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "1816405e-67a9-4bf0-b6e6-09937e8ddf56" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "wrqVUm-hxXh-Ecs2-ymPc-8H4f-icmu-jt88R1" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:33:00 +0000 (0:00:00.478) 0:00:18.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002723", "end": "2022-06-01 12:33:00.892400", "rc": 0, "start": "2022-06-01 12:33:00.889677" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:33:01 +0000 (0:00:00.465) 0:00:18.592 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003252", "end": "2022-06-01 12:33:01.291728", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:33:01.288476" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:33:01 +0000 (0:00:00.402) 0:00:18.994 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:33:01 +0000 (0:00:00.062) 0:00:19.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:33:01 +0000 (0:00:00.032) 0:00:19.089 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.062) 0:00:19.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.039) 0:00:19.191 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.475) 0:00:19.667 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.048) 0:00:19.715 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.040) 0:00:19.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.037) 0:00:19.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.037) 0:00:19.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.029) 0:00:19.861 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.041) 0:00:19.903 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.058) 0:00:19.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.031) 0:00:19.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.037) 0:00:20.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.032) 0:00:20.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.032) 0:00:20.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:33:02 +0000 (0:00:00.030) 0:00:20.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:20.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:20.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.032) 0:00:20.220 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.060) 0:00:20.280 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.060) 0:00:20.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:20.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.031) 0:00:20.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.029) 0:00:20.432 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.060) 0:00:20.492 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.033) 0:00:20.526 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.034) 0:00:20.560 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.085) 0:00:20.646 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.035) 0:00:20.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.036) 0:00:20.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:20.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.031) 0:00:20.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.029) 0:00:20.808 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:20.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.029) 0:00:20.869 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.065) 0:00:20.935 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK 
[get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.063) 0:00:20.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.030) 0:00:21.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.031) 0:00:21.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.032) 0:00:21.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:33:03 +0000 (0:00:00.031) 0:00:21.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.030) 0:00:21.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** 
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.029) 0:00:21.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.030) 0:00:21.216 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.032) 0:00:21.249 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.032) 0:00:21.282 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.058) 0:00:21.341 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.034) 0:00:21.375 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.125) 0:00:21.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.035) 0:00:21.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2591341, "block_size": 4096, "block_total": 2617856, "block_used": 26515, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 5240829, "inode_total": 5240832, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10614132736, "size_total": 10722738176, "uuid": "1816405e-67a9-4bf0-b6e6-09937e8ddf56" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2591341, "block_size": 4096, "block_total": 2617856, "block_used": 26515, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 5240829, "inode_total": 5240832, 
"inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10614132736, "size_total": 10722738176, "uuid": "1816405e-67a9-4bf0-b6e6-09937e8ddf56" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.041) 0:00:21.578 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.037) 0:00:21.616 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.034) 0:00:21.650 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.035) 0:00:21.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.030) 0:00:21.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 
2022 16:33:04 +0000 (0:00:00.029) 0:00:21.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.028) 0:00:21.775 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.031) 0:00:21.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.045) 0:00:21.852 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.032) 0:00:21.885 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] 
**************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.036) 0:00:21.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.028) 0:00:21.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.029) 0:00:21.981 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.035) 0:00:22.016 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:33:04 +0000 (0:00:00.035) 0:00:22.051 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101176.0941215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101176.0941215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", 
"inode": 3237, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101176.0941215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.408) 0:00:22.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.037) 0:00:22.497 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.037) 0:00:22.535 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.033) 0:00:22.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.029) 0:00:22.598 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.035) 0:00:22.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.029) 0:00:22.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.029) 0:00:22.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.032) 0:00:22.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.073) 0:00:22.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.030) 0:00:22.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.035) 0:00:22.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.033) 0:00:22.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.030) 0:00:22.929 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.031) 0:00:22.960 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.042) 0:00:23.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.035) 0:00:23.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.032) 0:00:23.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.031) 0:00:23.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:33:05 +0000 (0:00:00.029) 0:00:23.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.030) 0:00:23.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.032) 0:00:23.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.031) 0:00:23.225 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.029) 0:00:23.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.030) 0:00:23.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.030) 0:00:23.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.030) 0:00:23.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.033) 0:00:23.379 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.476) 0:00:23.856 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": 
false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.375) 0:00:24.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.039) 0:00:24.270 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.034) 0:00:24.305 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.030) 0:00:24.336 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.031) 0:00:24.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.032) 0:00:24.400 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.030) 0:00:24.430 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.029) 0:00:24.460 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.033) 0:00:24.493 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.032) 0:00:24.525 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.038) 0:00:24.564 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037161", "end": "2022-06-01 12:33:07.285442", "rc": 0, "start": "2022-06-01 12:33:07.248281" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.424) 0:00:24.989 ******** ok: [/cache/rhel-x.qcow2] 
=> { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.047) 0:00:25.036 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.041) 0:00:25.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:33:07 +0000 (0:00:00.033) 0:00:25.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.032) 0:00:25.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.031) 0:00:25.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.031) 0:00:25.206 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } 
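The `lvs --noheadings --nameprefixes --unquoted` output captured a few tasks above ("Get information about the LV") is a single line of `KEY=VALUE` tokens, which the role's subsequent `set_fact` turns into facts such as `storage_test_lv_segtype`. A minimal sketch of parsing that format (the `parse_lvs_noheadings` helper is hypothetical, not part of the role):

```python
def parse_lvs_noheadings(line):
    """Parse one line of `lvs --noheadings --nameprefixes --unquoted`
    output into a dict, e.g. LVM2_LV_NAME=test1 -> {"LVM2_LV_NAME": "test1"}.
    Empty values (LVM2_CACHE_TOTAL_BLOCKS=) map to an empty string."""
    fields = {}
    for token in line.split():
        key, _, value = token.partition("=")
        fields[key] = value
    return fields

# Sample taken verbatim from the task's STDOUT above.
sample = ("LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- "
          "LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear")
info = parse_lvs_noheadings(sample)
```

This yields `info["LVM2_SEGTYPE"] == "linear"`, matching the `storage_test_lv_segtype` fact the role derives in the next task.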
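Every task line in this log carries a timing pair like `(0:00:00.476) 0:00:23.856`: per-task duration, then cumulative elapsed time (this is the `profile_tasks`-style timestamp format). A small sketch for extracting the per-task duration when post-processing such a log (the helper name and regex are illustrative, not from the test suite):

```python
import re

# Matches the parenthesized per-task duration, e.g. "(0:00:00.476)".
TIMING_RE = re.compile(r"\((\d+):(\d{2}):(\d{2}\.\d+)\)")

def task_duration_seconds(line):
    """Return the per-task duration in seconds from one task timing line,
    or None if the line carries no timing pair."""
    m = TIMING_RE.search(line)
    if not m:
        return None
    hours, minutes, seconds = m.groups()
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

# Timing line copied from the "parse the actual size of the volume" task.
line = "Wednesday 01 June 2022 16:33:06 +0000 (0:00:00.476) 0:00:23.856 ********"
dur = task_duration_seconds(line)
```

Summing these durations across a run makes it easy to spot the slow tasks (here, the `stat`, `lvs`, and size-parsing tasks dominate at ~0.4s each while the skipped conditionals cost ~0.03s).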
TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.033) 0:00:25.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.030) 0:00:25.271 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.027) 0:00:25.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:45 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.030) 0:00:25.328 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.066) 0:00:25.395 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.045) 0:00:25.441 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.535) 
0:00:25.976 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:33:08 +0000 (0:00:00.129) 0:00:26.106 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.032) 0:00:26.138 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.031) 0:00:26.170 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.062) 0:00:26.232 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.028) 0:00:26.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.030) 0:00:26.290 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.035) 0:00:26.326 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.032) 0:00:26.359 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.030) 0:00:26.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.030) 0:00:26.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.032) 0:00:26.452 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.029) 0:00:26.482 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.043) 0:00:26.526 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:33:09 +0000 (0:00:00.029) 0:00:26.555 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:33:11 +0000 (0:00:01.876) 0:00:28.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.032) 0:00:28.465 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.029) 0:00:28.495 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.039) 0:00:28.534 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.036) 0:00:28.571 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.035) 0:00:28.606 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:33:11 +0000 (0:00:00.402) 0:00:29.008 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:33:12 +0000 (0:00:00.704)
0:00:29.713 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:33:12 +0000 (0:00:00.031) 0:00:29.744 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:33:13 +0000 (0:00:00.664) 0:00:30.408 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:33:13 +0000 (0:00:00.363) 0:00:30.772 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:33:13 +0000 (0:00:00.030) 0:00:30.802 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:57
Wednesday 01 June 2022 16:33:14 +0000 (0:00:00.825) 0:00:31.627 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:33:14 +0000 (0:00:00.054) 0:00:31.682 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:33:14 +0000 (0:00:00.040) 0:00:31.722 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:33:14 +0000 (0:00:00.064) 0:00:31.786 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.382) 0:00:32.169 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002982", "end": "2022-06-01 12:33:14.854244", "rc": 0, "start": "2022-06-01 12:33:14.851262" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.384) 0:00:32.553 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002507", "end": "2022-06-01 12:33:15.233119", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:33:15.230612" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.378) 0:00:32.932 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
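The /etc/crypttab read above exits with rc 1 because the file does not exist on the image, yet the task reports ok with "failed_when_result": false. A minimal sketch of that pattern (hedged, not the test's verbatim source; the actual task lives at verify-role-results.yml:24):

```yaml
# Sketch: read a file that may legitimately be absent without failing the play,
# as the crypttab check above does.
- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: storage_test_crypttab  # rc is 1 when the file is missing
  failed_when: false               # "No such file or directory" is an expected state
  changed_when: false              # a read never changes the host
```

Later assertions can then branch on `storage_test_crypttab.rc` instead of letting the missing file abort the run.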
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.064) 0:00:32.996 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.031) 0:00:33.028 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.062) 0:00:33.091 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:33:15 +0000 (0:00:00.039) 0:00:33.130 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.027) 0:00:33.157 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.027) 0:00:33.185 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.035) 0:00:33.221 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.034) 0:00:33.255 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.034) 0:00:33.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.031) 0:00:33.321 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.027) 0:00:33.349 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.054) 0:00:33.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.029)
0:00:33.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.030) 0:00:33.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.029) 0:00:33.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.029) 0:00:33.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.028) 0:00:33.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.029) 0:00:33.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.028) 0:00:33.609 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.034) 0:00:33.643 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.059) 0:00:33.703 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.060) 0:00:33.763 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.038) 0:00:33.801 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.033) 0:00:33.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.034) 0:00:33.869 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.065) 0:00:33.934 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.037) 0:00:33.971 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.030) 0:00:34.002 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.028) 0:00:34.031 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.030) 0:00:34.061 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:33:16 +0000 (0:00:00.060) 0:00:34.121 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.064) 0:00:34.186 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.030) 0:00:34.217 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.028) 0:00:34.305 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.070) 0:00:34.376 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.435 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.464 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.494 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
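Both [WARNING] lines in this run ('storage_test_pool' and 'storage_test_volume') come from nested include loops reusing a loop variable that an outer loop already defines. A hedged sketch of the remedy the warning itself suggests, with a hypothetical variable name:

```yaml
# Sketch: give the inner include_tasks loop its own variable via loop_control
# so it no longer collides with the outer loop's 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes | default([]) }}"
  loop_control:
    loop_var: storage_test_inner_volume  # hypothetical name; any unused variable works
```

The included file would then reference `storage_test_inner_volume` instead of the shadowed outer variable.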
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.059) 0:00:34.553 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.033) 0:00:34.586 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.116) 0:00:34.703 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.033) 0:00:34.737 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.041) 0:00:34.778 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.031) 0:00:34.810 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.035) 0:00:34.845 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.032) 0:00:34.877 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.907 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.937 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.029) 0:00:34.966 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.034) 0:00:35.000 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.063) 0:00:35.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.027) 0:00:35.090 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:33:17 +0000 (0:00:00.036) 0:00:35.127 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.030) 0:00:35.158 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.031) 0:00:35.189 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.030) 0:00:35.220 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.029) 0:00:35.249 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.370) 0:00:35.620 ********
ok:
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.036) 0:00:35.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.027) 0:00:35.684 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.033) 0:00:35.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.030) 0:00:35.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.025) 0:00:35.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.029) 0:00:35.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.033) 0:00:35.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.030) 0:00:35.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.026) 0:00:35.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.029) 0:00:35.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.028) 0:00:35.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.029) 0:00:35.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.032) 0:00:36.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.030) 0:00:36.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.037) 0:00:36.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:33:18 +0000 (0:00:00.036) 0:00:36.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.029) 0:00:36.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.028) 0:00:36.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.028) 0:00:36.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.357 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.032) 0:00:36.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.542 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.069) 0:00:36.612 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.031) 0:00:36.643 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.705 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.029) 0:00:36.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.029) 0:00:36.764 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.036) 0:00:36.800 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.034) 0:00:36.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.029) 0:00:36.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:36.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.033) 0:00:36.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.032) 0:00:37.020 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.031) 0:00:37.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.032) 0:00:37.083 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:33:19 +0000 (0:00:00.030) 0:00:37.114 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:33:20 +0000 (0:00:00.029) 0:00:37.143 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:33:20 +0000 (0:00:00.029) 0:00:37.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
*********************************************************************
/cache/rhel-x.qcow2 : ok=191 changed=4 unreachable=0 failed=0 skipped=163 rescued=0 ignored=0

Wednesday 01 June 2022 16:33:20 +0000 (0:00:00.014) 0:00:37.188 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:2 -----------------
linux-system-roles.storage : make sure blivet is available -------------- 1.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.61s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : set up new/current mounts ------------------ 0.58s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:33:20 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:33:22 +0000 (0:00:01.275) 0:00:01.298 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_lv_size_equal_to_vg_nvme_generated.yml ******************
2 plays in /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:33:22 +0000 (0:00:00.015) 0:00:01.313 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:33:22 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:33:24 +0000 (0:00:01.499) 0:00:01.522 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.50s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lv_size_equal_to_vg_scsi_generated.yml ****************** 2 plays in /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_scsi_generated.yml:3 Wednesday 01 June 2022 16:33:24 +0000 (0:00:00.013) 0:00:01.536 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_scsi_generated.yml:7 Wednesday 01 June 2022 16:33:25 +0000 (0:00:01.061) 0:00:02.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:2 Wednesday 01 June 2022 16:33:25 +0000 (0:00:00.024) 0:00:02.622 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:16 Wednesday 01 June 2022 16:33:26 +0000 (0:00:00.823) 0:00:03.446 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:33:26 +0000 (0:00:00.037) 0:00:03.484 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:33:26 +0000 (0:00:00.153) 0:00:03.637 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:33:26 +0000 
(0:00:00.532) 0:00:04.170 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:33:27 +0000 (0:00:00.075) 0:00:04.246 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:33:27 +0000 (0:00:00.020) 0:00:04.266 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:33:27 +0000 (0:00:00.021) 0:00:04.288 ********
included:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:33:27 +0000 (0:00:00.189) 0:00:04.477 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:33:27 +0000 (0:00:00.019) 0:00:04.497 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:33:28 +0000 (0:00:01.067) 0:00:05.564 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:33:28 +0000 (0:00:00.045) 0:00:05.609 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:33:28 +0000 (0:00:00.044) 0:00:05.654 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:33:29 +0000 (0:00:00.657) 0:00:06.311 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:33:29 +0000 (0:00:00.077) 0:00:06.389 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:33:29 +0000 (0:00:00.020) 0:00:06.410 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:33:29 +0000 (0:00:00.020) 0:00:06.431 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:33:29 +0000 (0:00:00.018) 0:00:06.450 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:33:30 +0000 (0:00:00.788) 0:00:07.238 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:33:31 +0000 (0:00:01.731) 0:00:08.969 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:33:31 +0000 (0:00:00.042) 0:00:09.012 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:33:31 +0000 (0:00:00.027) 0:00:09.039 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.522) 0:00:09.562 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.032) 0:00:09.594
********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.027) 0:00:09.621 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.032) 0:00:09.653 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.030) 0:00:09.684 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.034) 0:00:09.718 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.030) 0:00:09.749 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.032) 0:00:09.781 ********

TASK
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.026) 0:00:09.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:33:32 +0000 (0:00:00.027) 0:00:09.835 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:33:33 +0000 (0:00:00.506) 0:00:10.341 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:33:33 +0000 (0:00:00.028) 0:00:10.369 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:19 Wednesday 01 June 2022 16:33:33 +0000 (0:00:00.804) 0:00:11.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:26 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.030) 0:00:11.204 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.044) 0:00:11.248 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.510) 0:00:11.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.035) 0:00:11.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.027) 0:00:11.823 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create one lv which size is equal to vg size] **************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:31 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.031) 0:00:11.854 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.053) 0:00:11.908 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:33:34 +0000 (0:00:00.070) 0:00:11.978 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] 
**** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.520) 0:00:12.498 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.068) 0:00:12.567 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.030) 0:00:12.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 
Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.030) 0:00:12.628 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.060) 0:00:12.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.024) 0:00:12.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.030) 0:00:12.743 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "10g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.035) 0:00:12.779 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.033) 0:00:12.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.029) 0:00:12.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.029) 0:00:12.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.029) 0:00:12.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.031) 0:00:12.932 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.041) 0:00:12.973 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:33:35 +0000 (0:00:00.026) 0:00:13.000 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create 
format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:33:37 +0000 (0:00:01.682) 0:00:14.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.030) 0:00:14.712 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.026) 0:00:14.739 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.037) 0:00:14.777 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.042) 0:00:14.820 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.044) 0:00:14.865 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:33:37 +0000 (0:00:00.030) 0:00:14.895 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:33:38 +0000 (0:00:00.954) 0:00:15.850 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, 
u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:33:39 +0000 (0:00:00.560) 0:00:16.410 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:33:39 +0000 (0:00:00.647) 0:00:17.058 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:33:40 +0000 (0:00:00.394) 0:00:17.452 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:33:40 +0000 (0:00:00.028) 0:00:17.481 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:43 Wednesday 01 June 2022 16:33:41 +0000 (0:00:00.868) 0:00:18.349 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:33:41 +0000 (0:00:00.052) 0:00:18.401 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:33:41 +0000 (0:00:00.038) 0:00:18.440 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:33:41 +0000 (0:00:00.028) 0:00:18.469 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "e7c5e042-b789-45d4-951e-9b0b8e6e6e9b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "TUWu6W-5ir5-Cy7Z-m2zR-cX0E-YNIX-Z9P6cl" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:33:41 +0000 (0:00:00.471) 0:00:18.941 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002477", "end": "2022-06-01 12:33:41.636699", "rc": 0, "start": "2022-06-01 12:33:41.634222" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.455) 0:00:19.396 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003537", "end": "2022-06-01 12:33:42.015437", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:33:42.011900" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.389) 0:00:19.786 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.062) 0:00:19.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.033) 0:00:19.883 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.063) 0:00:19.946 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:33:42 +0000 (0:00:00.039) 0:00:19.985 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.509) 0:00:20.495 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.039) 0:00:20.534 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.038) 0:00:20.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.034) 0:00:20.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.033) 0:00:20.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.030) 0:00:20.671 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.041) 0:00:20.713 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.054) 0:00:20.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.029) 0:00:20.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.058) 0:00:20.856 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.033) 0:00:20.889 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.036) 0:00:20.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.030) 0:00:20.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.030) 0:00:20.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.029) 0:00:21.015 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.034) 0:00:21.050 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.056) 0:00:21.106 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:33:43 +0000 (0:00:00.059) 0:00:21.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.031) 0:00:21.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.028) 0:00:21.225 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:21.255 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.060) 0:00:21.316 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.036) 0:00:21.353 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.033) 0:00:21.386 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.056) 0:00:21.442 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.033) 0:00:21.476 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.037) 0:00:21.514 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.030) 0:00:21.544 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.028) 0:00:21.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:21.602 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.031) 0:00:21.633 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.030) 0:00:21.663 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.065) 0:00:21.729 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.063) 0:00:21.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.031) 0:00:21.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:21.854 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.030) 0:00:21.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.030) 0:00:21.914 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:21.944 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:21.974 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.032) 0:00:22.006 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.030) 0:00:22.037 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.029) 0:00:22.066 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.056) 0:00:22.122 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:33:44 +0000 (0:00:00.034) 0:00:22.157 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.127) 0:00:22.284 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.032) 0:00:22.317 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2591341, "block_size": 4096, "block_total": 2617856, "block_used": 26515, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 5240829, "inode_total": 5240832, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10614132736, "size_total": 10722738176, "uuid": "e7c5e042-b789-45d4-951e-9b0b8e6e6e9b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2591341, "block_size": 4096, "block_total": 2617856, "block_used": 26515, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 5240829, "inode_total": 5240832, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10614132736, "size_total": 10722738176, "uuid": "e7c5e042-b789-45d4-951e-9b0b8e6e6e9b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.039) 0:00:22.357 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.036) 0:00:22.393 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.036) 0:00:22.429 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.043) 0:00:22.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.031) 0:00:22.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.030) 0:00:22.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.029) 0:00:22.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.030) 0:00:22.596 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.047) 0:00:22.644 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.081) 0:00:22.725 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.036) 0:00:22.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.030) 0:00:22.793 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.033) 0:00:22.826 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.038) 0:00:22.865 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:33:45 +0000 (0:00:00.041) 0:00:22.907 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101216.8691216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101216.8691216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 3374, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101216.8691216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.392) 0:00:23.299 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.040) 0:00:23.339 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.036) 0:00:23.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.033) 0:00:23.409 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:23.438 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.035) 0:00:23.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:23.503 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.033) 0:00:23.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:23.566 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.038) 0:00:23.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.032) 0:00:23.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.032) 0:00:23.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.032) 0:00:23.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.033) 0:00:23.736 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.031) 0:00:23.767 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.041) 0:00:23.809 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.041) 0:00:23.850 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.030) 0:00:23.881 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:23.910 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.030) 0:00:23.941 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.032) 0:00:23.974 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.032) 0:00:24.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:24.036 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.027) 0:00:24.064 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.027) 0:00:24.091 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.034) 0:00:24.125 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:24.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:33:46 +0000 (0:00:00.029) 0:00:24.185 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:33:47 +0000 (0:00:00.525) 0:00:24.711 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:33:47 +0000 (0:00:00.381) 0:00:25.092 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:33:47 +0000 (0:00:00.040) 0:00:25.133 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:33:47 +0000 (0:00:00.035) 0:00:25.168 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.031) 0:00:25.200 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.029) 0:00:25.229 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.029) 0:00:25.259 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.029) 0:00:25.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.029) 0:00:25.318 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.033) 0:00:25.352 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.034) 0:00:25.387 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.038) 0:00:25.425 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.039145", "end": "2022-06-01 12:33:48.081734", "rc": 0, "start": "2022-06-01 12:33:48.042589" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.415) 0:00:25.841 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.038) 0:00:25.880 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.039) 0:00:25.919 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.031) 0:00:25.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.031) 0:00:25.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.032) 0:00:26.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.032) 0:00:26.046 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.030) 0:00:26.077 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.031) 0:00:26.108 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.030) 0:00:26.138 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:45
Wednesday 01 June 2022 16:33:48 +0000 (0:00:00.030) 0:00:26.169 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.103) 0:00:26.273 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.044) 0:00:26.318 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.502) 0:00:26.820 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.068) 0:00:26.889 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.030) 0:00:26.919 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.028) 0:00:26.948 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.060) 0:00:27.009 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.025) 0:00:27.035 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.034) 0:00:27.069 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.035) 0:00:27.108 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.035) 0:00:27.144 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories
if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:33:49 +0000 (0:00:00.037) 0:00:27.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:33:50 +0000 (0:00:00.034) 0:00:27.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:33:50 +0000 (0:00:00.029) 0:00:27.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:33:50 +0000 (0:00:00.030) 0:00:27.276 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:33:50 +0000 (0:00:00.045) 0:00:27.322 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:33:50 +0000 (0:00:00.028) 0:00:27.351 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": 
"destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:33:52 +0000 (0:00:01.860) 0:00:29.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.031) 0:00:29.243 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.028) 0:00:29.272 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": 
[], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.040) 0:00:29.312 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 
0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.038) 0:00:29.350 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.037) 0:00:29.388 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:33:52 +0000 (0:00:00.391) 0:00:29.779 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:33:53 +0000 (0:00:00.649) 
0:00:30.429 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:33:53 +0000 (0:00:00.030) 0:00:30.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:33:53 +0000 (0:00:00.628) 0:00:31.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:33:54 +0000 (0:00:00.369) 0:00:31.457 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:33:54 +0000 (0:00:00.029) 0:00:31.487 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:57 Wednesday 01 June 2022 16:33:55 +0000 (0:00:00.919) 0:00:32.406 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:33:55 +0000 (0:00:00.062) 0:00:32.468 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10733223936, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:33:55 +0000 (0:00:00.044) 0:00:32.513 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:33:55 +0000 (0:00:00.031) 0:00:32.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:33:55 +0000 (0:00:00.378) 0:00:32.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003311", 
"end": "2022-06-01 12:33:55.557653", "rc": 0, "start": "2022-06-01 12:33:55.554342" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.396) 0:00:33.320 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002783", "end": "2022-06-01 12:33:55.934463", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:33:55.931680" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.374) 0:00:33.695 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.063) 0:00:33.758 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.031) 0:00:33.790 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.069) 0:00:33.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.039) 0:00:33.899 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.028) 0:00:33.927 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.029) 0:00:33.957 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.037) 0:00:33.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.038) 0:00:34.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.037) 0:00:34.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.031) 0:00:34.102 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.028) 0:00:34.131 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:33:56 +0000 (0:00:00.055) 0:00:34.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.030) 
0:00:34.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:34.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.027) 0:00:34.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.027) 0:00:34.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.028) 0:00:34.329 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:34.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.030) 0:00:34.389 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:34.418 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.056) 0:00:34.475 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.058) 0:00:34.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:34.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.027) 0:00:34.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.028) 0:00:34.619 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.064) 0:00:34.683 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.072) 0:00:34.755 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.028) 0:00:34.784 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.028) 0:00:34.812 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.028) 0:00:34.841 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.061) 0:00:34.903 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.062) 0:00:34.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:34.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:35.025 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.032) 0:00:35.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.033) 0:00:35.091 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.032) 0:00:35.124 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.031) 0:00:35.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:33:57 +0000 (0:00:00.029) 0:00:35.185 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.030) 0:00:35.215 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.032) 0:00:35.248 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
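The `[WARNING]` above describes its own remedy: give the inner loop a dedicated variable name via `loop_control` instead of reusing one already set by an outer include. A minimal, hypothetical sketch of that fix (the task, file, and variable names here are illustrative, not taken from this test suite):

```yaml
# Hypothetical include task; the explicit loop_var keeps the inner loop
# from clobbering a 'storage_test_volume' variable set by the caller.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner  # distinct name avoids the collision
```

With a distinct `loop_var`, nested includes no longer shadow each other's loop variables, which is exactly the collision the warning reports.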
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.056) 0:00:35.305 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.034) 0:00:35.339 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.135) 0:00:35.475 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.039) 0:00:35.514 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.040) 0:00:35.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.030) 0:00:35.585 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.035) 0:00:35.620 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.029) 0:00:35.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.028) 0:00:35.678 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.028) 0:00:35.707 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.027) 0:00:35.735 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.030) 0:00:35.766 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.043) 0:00:35.809 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.024) 0:00:35.834 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.036) 0:00:35.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.029) 0:00:35.900 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.029) 0:00:35.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.029) 0:00:35.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:33:58 +0000 (0:00:00.024) 0:00:35.984 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.382) 0:00:36.367 ********
ok:
[/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.036) 0:00:36.404 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.026) 0:00:36.430 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.033) 0:00:36.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.029) 0:00:36.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.026) 0:00:36.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.029) 0:00:36.549 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.028) 0:00:36.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.027) 0:00:36.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.025) 0:00:36.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.029) 0:00:36.660 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.031) 0:00:36.692 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.028) 0:00:36.721 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.030) 0:00:36.751 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.029) 0:00:36.781 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.035) 0:00:36.816 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.035) 0:00:36.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.031) 0:00:36.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.076) 0:00:36.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.030) 0:00:36.989 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.028) 0:00:37.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.030) 0:00:37.049 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.030) 0:00:37.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.031) 0:00:37.111 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.032) 0:00:37.144 ********
skipping: [/cache/rhel-x.qcow2] =>
{ "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:33:59 +0000 (0:00:00.031) 0:00:37.175 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.031) 0:00:37.207 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.237 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.268 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.032) 0:00:37.331 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.032) 0:00:37.364 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.394 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.455 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.029) 0:00:37.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.035) 0:00:37.520 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.035) 0:00:37.555 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.033) 0:00:37.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.619 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.028) 0:00:37.678 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.032) 0:00:37.710 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.032) 0:00:37.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.031) 0:00:37.774 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.804 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.030) 0:00:37.835 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.031) 0:00:37.867 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.033) 0:00:37.901 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.028) 0:00:37.929 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP
*********************************************************************
/cache/rhel-x.qcow2 : ok=193 changed=4 unreachable=0 failed=0 skipped=163 rescued=0 ignored=0

Wednesday 01 June 2022 16:34:00 +0000 (0:00:00.016) 0:00:37.945 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.50s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.06s
/tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_scsi_generated.yml:3 --
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml:2 -----------------
linux-system-roles.storage : Update facts ------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : set up new/current mounts ------------------ 0.56s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
parse the actual size of the volume ------------------------------------- 0.53s
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 --------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:34:01 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:34:02 +0000 (0:00:01.273) 0:00:01.296 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvm_cache_then_remove.yml ******************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:2 
Wednesday 01 June 2022 16:34:02 +0000 (0:00:00.013) 0:00:01.310 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:14
Wednesday 01 June 2022 16:34:03 +0000 (0:00:01.090) 0:00:02.401 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:34:03 +0000 (0:00:00.038) 0:00:02.439 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:34:04 +0000 (0:00:00.156) 0:00:02.596 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:34:04 +0000 (0:00:00.779) 0:00:03.375 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:34:04 +0000 (0:00:00.075) 0:00:03.451 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:34:04 +0000 (0:00:00.025) 0:00:03.476 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:34:05 +0000 (0:00:00.022) 0:00:03.499 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:34:05 +0000 (0:00:00.196) 0:00:03.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:34:05 +0000 (0:00:00.019) 0:00:03.714 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:34:06 +0000 (0:00:01.052) 0:00:04.767 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:34:06 +0000 (0:00:00.048) 0:00:04.815 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:34:06 +0000 (0:00:00.044) 0:00:04.860 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:34:07 +0000 (0:00:00.673) 0:00:05.533 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:34:07 +0000 (0:00:00.080) 0:00:05.614 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:34:07 +0000 (0:00:00.020) 0:00:05.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:34:07 +0000 (0:00:00.021) 0:00:05.656 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:34:07 +0000 (0:00:00.020) 0:00:05.677 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:34:08 +0000 (0:00:00.849) 0:00:06.526 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd",
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:34:09 +0000 (0:00:01.775) 0:00:08.301 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:34:09 +0000 (0:00:00.043) 0:00:08.345 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:34:09 +0000 (0:00:00.057) 0:00:08.403 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.527) 0:00:08.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.029) 0:00:08.960 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.026) 0:00:08.987 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.033) 0:00:09.020 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.033) 0:00:09.053 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.032) 0:00:09.085 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.027) 0:00:09.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.028) 0:00:09.141 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.025) 0:00:09.167 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:34:10 +0000 (0:00:00.026) 0:00:09.194 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:34:11 +0000 (0:00:00.464) 0:00:09.658 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:34:11 +0000 (0:00:00.027) 0:00:09.685 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:17
Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.809) 0:00:10.495 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:24
Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.029) 0:00:10.524 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.041) 0:00:10.565 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.519) 0:00:11.084 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.036) 0:00:11.121 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks]
****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.030) 0:00:11.151 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a cached LVM logical volume under volume group 'foo'] ************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:30 Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.033) 0:00:11.184 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.057) 0:00:11.242 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:34:12 +0000 (0:00:00.043) 0:00:11.285 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.525) 0:00:11.811 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.068) 0:00:11.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.029) 0:00:11.909 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.029) 0:00:11.938 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.059) 0:00:11.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.026) 0:00:12.024 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.058) 0:00:12.083 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "volumes": [ { "cache_devices": [ "sdb" ], "cache_size": "4g", "cached": true, "name": "test", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.036) 0:00:12.119 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.030) 0:00:12.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.028) 0:00:12.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.028) 0:00:12.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get 
service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.028) 0:00:12.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.029) 0:00:12.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.043) 0:00:12.307 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:34:13 +0000 (0:00:00.027) 0:00:12.334 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", 
"/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:34:17 +0000 (0:00:03.461) 0:00:15.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.029) 0:00:15.825 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.028) 0:00:15.853 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": 
[ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.046) 0:00:15.900 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.041) 0:00:15.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.036) 0:00:15.978 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.028) 0:00:16.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.029) 0:00:16.036 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.029) 0:00:16.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.031) 0:00:16.097 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:34:17 +0000 (0:00:00.367) 0:00:16.464 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:34:18 +0000 (0:00:00.029) 0:00:16.494 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:44 Wednesday 01 June 2022 16:34:18 +0000 (0:00:00.864) 0:00:17.359 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:34:18 +0000 (0:00:00.053) 0:00:17.413 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": 
"/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:34:18 +0000 (0:00:00.044) 0:00:17.458 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:34:19 +0000 (0:00:00.031) 0:00:17.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test", "size": "5G", "type": "lvm", "uuid": "5d0695aa-2a73-4dd3-9c6b-5fdfb80dcfdb" }, "/dev/mapper/foo-test_cache_cpool_cdata": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_cache_cpool_cdata", "size": "4G", "type": "lvm", "uuid": "" }, "/dev/mapper/foo-test_cache_cpool_cmeta": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_cache_cpool_cmeta", "size": "8M", "type": "lvm", "uuid": "" }, "/dev/mapper/foo-test_corig": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_corig", "size": "5G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "8C9Nj1-R1Wv-Cu8w-GBCz-D2BR-Zv4P-NJCzuE" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "ZgTugA-2giu-QcY3-h33E-4uQ0-5bcQ-2kmICv" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, 
"/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:34:19 +0000 (0:00:00.528) 0:00:18.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002669", "end": "2022-06-01 12:34:19.465261", "rc": 0, "start": "2022-06-01 12:34:19.462592" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.504) 0:00:18.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002747", "end": "2022-06-01 12:34:19.841179", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:34:19.838432" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.372) 0:00:18.895 ******** 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.102) 0:00:18.997 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.032) 0:00:19.030 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.064) 0:00:19.095 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:34:20 +0000 (0:00:00.043) 0:00:19.138 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.800) 0:00:19.938 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.053) 0:00:19.991 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.037) 0:00:20.029 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.037) 0:00:20.066 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.037) 0:00:20.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.031) 0:00:20.136 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.052) 0:00:20.189 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.057) 0:00:20.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.029) 0:00:20.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.028) 0:00:20.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.029) 0:00:20.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.028) 0:00:20.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.030) 0:00:20.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.031) 0:00:20.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:34:21 +0000 (0:00:00.033) 0:00:20.459 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:20.491 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.062) 0:00:20.553 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.061) 0:00:20.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:20.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.030) 0:00:20.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.030) 0:00:20.708 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.062) 0:00:20.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.037) 0:00:20.808 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.038) 0:00:20.847 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.066) 0:00:20.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.034) 0:00:20.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.038) 0:00:20.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:21.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:21.050 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.035) 0:00:21.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:21.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.036) 0:00:21.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.040) 0:00:21.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.030) 0:00:21.226 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.029) 0:00:21.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.030) 0:00:21.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.032) 0:00:21.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.031) 0:00:21.350 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:34:22 +0000 (0:00:00.113) 0:00:21.464 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.069) 0:00:21.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.033) 0:00:21.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.032) 0:00:21.600 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.032) 0:00:21.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:21.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:21.694 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.031) 0:00:21.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.034) 0:00:21.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:21.791 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:21.822 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.060) 0:00:21.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.037) 0:00:21.920 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.132) 0:00:22.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.035) 0:00:22.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.039) 0:00:22.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:22.159 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.033) 0:00:22.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 
16:34:23 +0000 (0:00:00.028) 0:00:22.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.031) 0:00:22.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.031) 0:00:22.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.030) 0:00:22.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.034) 0:00:22.350 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.046) 0:00:22.397 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.035) 0:00:22.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:34:23 +0000 (0:00:00.038) 0:00:22.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.031) 0:00:22.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.032) 0:00:22.534 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.040) 0:00:22.575 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.038) 0:00:22.614 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101256.6781216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101256.6781216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 3696, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101256.6781216, "nlink": 1, "path": "/dev/mapper/foo-test", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.386) 0:00:23.000 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.036) 0:00:23.036 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.042) 0:00:23.079 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get 
RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.033) 0:00:23.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.038) 0:00:23.151 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.038) 0:00:23.189 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.029) 0:00:23.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.032) 0:00:23.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.029) 0:00:23.280 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.038) 0:00:23.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.032) 0:00:23.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.030) 0:00:23.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.029) 0:00:23.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.030) 0:00:23.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:34:24 +0000 (0:00:00.030) 0:00:23.472 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.038) 0:00:23.511 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.038) 0:00:23.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.082) 0:00:23.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.032) 0:00:23.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.031) 0:00:23.695 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.032) 0:00:23.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.034) 0:00:23.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.030) 0:00:23.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.034) 0:00:23.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.031) 0:00:23.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.031) 0:00:23.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.030) 0:00:23.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
*************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.031) 0:00:23.952 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:34:25 +0000 (0:00:00.481) 0:00:24.434 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.375) 0:00:24.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.037) 0:00:24.847 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.033) 0:00:24.880 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.029) 0:00:24.910 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.029) 0:00:24.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.030) 0:00:24.970 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.032) 0:00:25.003 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.033) 0:00:25.036 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.035) 0:00:25.072 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.037) 0:00:25.109 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed
TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:34:26 +0000 (0:00:00.044) 0:00:25.154 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test" ], "delta": "0:00:00.049841", "end": "2022-06-01 12:34:26.540586", "rc": 0, "start": "2022-06-01 12:34:26.490745" }
STDOUT:
LVM2_LV_NAME=test LVM2_LV_ATTR=Cwi-a-C--- LVM2_CACHE_TOTAL_BLOCKS=65280 LVM2_CHUNK_SIZE=65536 LVM2_SEGTYPE=cache
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.444) 0:00:25.598 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "cache" ] }, "changed": false }
TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.039) 0:00:25.638 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.040) 0:00:25.678 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_cache_size": [ "65280", "65536" ] }, "changed": false }
TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.040) 0:00:25.718 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.374) 0:00:26.093 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_cache_size": "4294967296" }, "changed": false }
TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.039) 0:00:26.133 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed
TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.043) 0:00:26.176 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.031) 0:00:26.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.030) 0:00:26.238 ********
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.027) 0:00:26.266 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
TASK [Remove (detach) cache from the 'test' LV created above] ******************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:46
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.033) 0:00:26.299 ********
TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.068) 0:00:26.367 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:34:27 +0000 (0:00:00.053) 0:00:26.421 ********
ok: [/cache/rhel-x.qcow2]
TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.541) 0:00:26.963 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.071) 0:00:27.035 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.031) 0:00:27.067 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.030) 0:00:27.098 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.062) 0:00:27.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.026) 0:00:27.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.030) 0:00:27.217 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "volumes": [ { "cached": false, "name": "test", "size": "5g" } ] } ] }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.037) 0:00:27.255 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.035) 0:00:27.290 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.030) 0:00:27.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.031) 0:00:27.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.031) 0:00:27.385 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.030) 0:00:27.415 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:34:28 +0000 (0:00:00.048) 0:00:27.464 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:34:29 +0000 (0:00:00.029) 0:00:27.494 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test_cache", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/foo-test", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }
TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:34:34 +0000 (0:00:05.281) 0:00:32.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.033) 0:00:32.809 ********
TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.029) 0:00:32.838 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test_cache", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/foo-test", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }
TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.039) 0:00:32.878 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }
TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.036) 0:00:32.914 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.035) 0:00:32.949 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.030) 0:00:32.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.032) 0:00:33.012 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.029) 0:00:33.042 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.031) 0:00:33.073 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }
TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.367) 0:00:33.441 ********
TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:34:34 +0000 (0:00:00.031) 0:00:33.472 ********
ok: [/cache/rhel-x.qcow2]
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:58
Wednesday 01 June 2022 16:34:35 +0000 (0:00:00.932) 0:00:34.405 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2
TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:34:35 +0000 (0:00:00.057) 0:00:34.462 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:34:36 +0000 (0:00:00.041) 0:00:34.504 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:34:36 +0000 (0:00:00.066) 0:00:34.571 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test", "size": "5G", "type": "lvm", "uuid": "5d0695aa-2a73-4dd3-9c6b-5fdfb80dcfdb" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "8C9Nj1-R1Wv-Cu8w-GBCz-D2BR-Zv4P-NJCzuE" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "ZgTugA-2giu-QcY3-h33E-4uQ0-5bcQ-2kmICv" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:34:36 +0000 (0:00:00.372) 0:00:34.944 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003135", "end": "2022-06-01 12:34:36.270479", "rc": 0, "start": "2022-06-01 12:34:36.267344" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:34:36 +0000 (0:00:00.384) 0:00:35.328 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002903", "end": "2022-06-01 12:34:36.663872", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:34:36.660969" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG:
non-zero return code
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:34:37 +0000 (0:00:00.393) 0:00:35.721 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:34:37 +0000 (0:00:00.063) 0:00:35.784 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:34:37 +0000 (0:00:00.030) 0:00:35.815 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:34:37 +0000 (0:00:00.061) 0:00:35.876 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:34:37 +0000 (0:00:00.039) 0:00:35.916 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.722) 0:00:36.638 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }
TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.050) 0:00:36.689 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.037) 0:00:36.727 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.036) 0:00:36.763 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.035) 0:00:36.798 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.030) 0:00:36.829 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }
MSG:
All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }
MSG:
All assertions passed
TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.052) 0:00:36.881 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.065) 0:00:36.946 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.034) 0:00:36.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.032) 0:00:37.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.033) 0:00:37.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.031) 0:00:37.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.030) 0:00:37.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.030) 0:00:37.140 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.031) 0:00:37.172 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.031) 0:00:37.204 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.061) 0:00:37.265 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.029) 0:00:37.326 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.029) 0:00:37.356 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.032) 0:00:37.388 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.030) 0:00:37.419 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:34:38 +0000 (0:00:00.062) 0:00:37.482 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.037) 0:00:37.519 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.041) 0:00:37.561 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.123) 0:00:37.684 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.036) 0:00:37.720 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.036) 0:00:37.757 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.032) 0:00:37.789 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:37.820 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.029) 0:00:37.850 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:37.880 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.034) 0:00:37.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.034) 0:00:37.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.033) 0:00:37.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:38.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:38.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.031) 0:00:38.076 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.031) 0:00:38.107 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.064) 0:00:38.171 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.063) 0:00:38.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.031) 0:00:38.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.032) 0:00:38.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.031) 0:00:38.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:38.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:38.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.031) 0:00:38.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.030) 0:00:38.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:34:39 +0000 (0:00:00.034) 0:00:38.488 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.031) 0:00:38.519 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.058) 0:00:38.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.035) 0:00:38.614 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.124) 0:00:38.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.035) 0:00:38.774 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.040) 0:00:38.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.031) 0:00:38.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.035) 0:00:38.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.032) 0:00:38.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.032) 0:00:38.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.030) 0:00:38.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.036) 0:00:39.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.037) 0:00:39.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.050) 0:00:39.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.037) 0:00:39.138 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.041) 0:00:39.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.032) 0:00:39.212 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.032) 0:00:39.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.038) 0:00:39.284 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:34:40 +0000 (0:00:00.037) 0:00:39.321 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101272.4271214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101272.4271214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 3696, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101272.4271214, "nlink": 1, "path": "/dev/mapper/foo-test", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.377) 0:00:39.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.039) 0:00:39.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 
01 June 2022 16:34:41 +0000 (0:00:00.037) 0:00:39.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.085) 0:00:39.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:39.894 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.035) 0:00:39.929 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:39.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.029) 0:00:39.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.029) 0:00:40.020 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.040) 0:00:40.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:40.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:40.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:40.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.030) 0:00:40.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.030) 0:00:40.217 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.040) 0:00:40.257 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.035) 0:00:40.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.030) 0:00:40.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.030) 0:00:40.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.029) 0:00:40.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.032) 0:00:40.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.033) 0:00:40.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:34:41 +0000 (0:00:00.031) 0:00:40.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.035) 0:00:40.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.031) 0:00:40.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.031) 0:00:40.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
16:34:42 +0000 (0:00:00.031) 0:00:40.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.035) 0:00:40.647 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.368) 0:00:41.016 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.380) 0:00:41.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.039) 0:00:41.436 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:34:42 +0000 (0:00:00.042) 0:00:41.478 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.036) 0:00:41.514 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.035) 0:00:41.550 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.032) 0:00:41.582 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.032) 0:00:41.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.037) 0:00:41.653 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.038) 0:00:41.691 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.034) 0:00:41.726 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.039) 0:00:41.765 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test" ], "delta": "0:00:00.031154", "end": "2022-06-01 12:34:43.121415", "rc": 0, "start": "2022-06-01 12:34:43.090261" } STDOUT: LVM2_LV_NAME=test LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.408) 0:00:42.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.039) 0:00:42.214 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.039) 0:00:42.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.033) 0:00:42.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 
June 2022 16:34:43 +0000 (0:00:00.032) 0:00:42.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.033) 0:00:42.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.032) 0:00:42.387 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.031) 0:00:42.419 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.032) 0:00:42.451 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:34:43 +0000 (0:00:00.030) 0:00:42.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:60 Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.032) 0:00:42.514 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.075) 0:00:42.589 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.099) 0:00:42.688 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.689) 0:00:43.378 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.070) 0:00:43.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:34:44 +0000 (0:00:00.031) 0:00:43.480 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.031) 0:00:43.512 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.063) 0:00:43.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.026) 0:00:43.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.031) 0:00:43.633 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "state": "absent", "volumes": [ { "name": "test", "size": "5g" } ] } ] } TASK 
[linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.038) 0:00:43.671 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.035) 0:00:43.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.035) 0:00:43.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.034) 0:00:43.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.031) 0:00:43.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.033) 0:00:43.842 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.056) 0:00:43.898 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:34:45 +0000 (0:00:00.029) 0:00:43.928 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:34:48 +0000 (0:00:02.718) 0:00:46.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.030) 0:00:46.676 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.027) 0:00:46.704 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "destroy device", "device": 
"/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null 
} ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.044) 0:00:46.749 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.038) 0:00:46.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.037) 0:00:46.825 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.030) 0:00:46.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.031) 0:00:46.887 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.030) 0:00:46.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.031) 0:00:46.949 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:34:48 +0000 (0:00:00.355) 0:00:47.305 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:34:48 
+0000 (0:00:00.029) 0:00:47.335 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:72 Wednesday 01 June 2022 16:34:49 +0000 (0:00:00.878) 0:00:48.213 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:34:49 +0000 (0:00:00.058) 0:00:48.272 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:34:49 +0000 (0:00:00.039) 0:00:48.312 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:34:49 +0000 (0:00:00.030) 0:00:48.343 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:34:50 +0000 (0:00:00.379) 0:00:48.722 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002875", "end": "2022-06-01 12:34:50.042114", "rc": 0, "start": "2022-06-01 12:34:50.039239" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:34:50 +0000 (0:00:00.379) 0:00:49.102 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002866", "end": "2022-06-01 12:34:50.419776", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:34:50.416910" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:34:50 +0000 (0:00:00.371) 0:00:49.473 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
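The `loop_var` warning above comes from an `include_tasks` loop reusing a variable name that an outer loop already holds. It can be avoided by giving the loop a dedicated variable via `loop_control`; a sketch, with an illustrative variable name rather than the test suite's actual source:

```yaml
# Hedged sketch: give the pool-verification loop its own loop variable so it
# does not collide with an outer loop already using 'storage_test_pool'.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item   # illustrative name, not the role's
```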
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.064) 0:00:49.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.029) 0:00:49.567 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.059) 0:00:49.626 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.037) 0:00:49.664 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.027) 0:00:49.692 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.028) 0:00:49.721 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.036) 0:00:49.757 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.037) 0:00:49.794 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.034) 0:00:49.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.029) 0:00:49.858 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.026) 0:00:49.885 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.064) 0:00:49.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.032) 
0:00:49.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.030) 0:00:50.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.031) 0:00:50.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.033) 0:00:50.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.030) 0:00:50.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.031) 0:00:50.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.030) 0:00:50.170 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.032) 0:00:50.202 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.060) 0:00:50.263 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.058) 0:00:50.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.031) 0:00:50.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.030) 0:00:50.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.029) 0:00:50.412 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:34:51 +0000 (0:00:00.062) 0:00:50.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.037) 0:00:50.512 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.028) 0:00:50.541 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.028) 0:00:50.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.032) 0:00:50.602 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.061) 0:00:50.664 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.063) 0:00:50.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.032) 0:00:50.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:50.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:50.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:50.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:50.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:50.912 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.079) 0:00:50.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.034) 0:00:51.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.036) 0:00:51.063 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
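The WARNING above is Ansible's own advice for nested loops that reuse the default loop variable name. A minimal sketch of the fix it suggests (hypothetical task and list names, not taken from this log), using Ansible's `loop_control`/`loop_var` options:

```yaml
# Hypothetical illustration of the fix the warning suggests: give the inner
# include loop its own variable name so it cannot shadow the outer
# 'storage_test_volume' loop variable.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"   # placeholder list, not from this log
  loop_control:
    loop_var: storage_test_volume_inner     # avoids the reported collision
```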
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.061) 0:00:51.124 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.035) 0:00:51.160 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.119) 0:00:51.280 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.038) 0:00:51.318 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.045) 0:00:51.363 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.031) 0:00:51.395 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.034) 0:00:51.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:34:52 +0000 (0:00:00.030) 0:00:51.460 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:51.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.032) 0:00:51.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:51.552 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:51.582 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.048) 0:00:51.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.026) 0:00:51.657 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.036) 0:00:51.693 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.031) 0:00:51.725 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:51.755 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:51.785 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.024) 0:00:51.809 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.372) 0:00:52.181 ********
ok:
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.037) 0:00:52.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.032) 0:00:52.252 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.040) 0:00:52.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.032) 0:00:52.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.025) 0:00:52.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:52.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:52.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.029) 0:00:52.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:34:53 +0000 (0:00:00.026) 0:00:52.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.032) 0:00:52.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.028) 0:00:52.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.037) 0:00:52.656 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.037) 0:00:52.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.031) 0:00:52.783 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.032) 0:00:52.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.033) 0:00:52.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:52.940 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:52.999 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:53.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.031) 0:00:53.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:53.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.028) 0:00:53.119 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:53.150 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:53.179 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.029) 0:00:53.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.080) 0:00:53.290 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.031) 0:00:53.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:53.352 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.033) 0:00:53.385 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:53.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.028) 0:00:53.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:34:54 +0000 (0:00:00.030) 0:00:53.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.029) 0:00:53.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.029) 0:00:53.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.029) 0:00:53.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.029) 0:00:53.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.028) 0:00:53.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.034) 0:00:53.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.034) 0:00:53.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.029) 0:00:53.721 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.033) 0:00:53.754 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
********************************************************************* /cache/rhel-x.qcow2 : ok=283 changed=3 unreachable=0 failed=0 skipped=245 rescued=0 ignored=0 Wednesday 01 June 2022 16:34:55 +0000 (0:00:00.015) 0:00:53.770 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 5.28s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.46s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:2 --------------- linux-system-roles.storage : make sure blivet is available -------------- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : make sure required packages are installed --- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Get the canonical device path for each member device -------------------- 0.80s /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------ linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Get the canonical device path for each member device -------------------- 0.72s /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------ linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : get required packages ---------------------- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Collect info about the volumes. 
----------------------------------------- 0.53s /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 ----------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:34:56 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 16:34:57 +0000 (0:00:01.243) 0:00:01.266 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_lvm_cache_then_remove_nvme_generated.yml ****************
2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 16:34:57 +0000 (0:00:00.017) 0:00:01.283 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:34:58 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 16:34:59 +0000 (0:00:01.254) 0:00:01.277 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_lvm_cache_then_remove_scsi_generated.yml ****************
2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_scsi_generated.yml:3
Wednesday 01 June 2022 16:34:59 +0000 (0:00:00.014) 0:00:01.292 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_scsi_generated.yml:7
Wednesday 01 June 2022 16:35:00 +0000 (0:00:01.090) 0:00:02.382 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:2
Wednesday 01 June 2022 16:35:00 +0000 (0:00:00.027) 0:00:02.409 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:14
Wednesday 01 June 2022 16:35:01 +0000 (0:00:00.793) 0:00:03.203 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:35:01 +0000 (0:00:00.038) 0:00:03.241 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:35:01 +0000 (0:00:00.154) 0:00:03.395 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:35:01 +0000 (0:00:00.525) 0:00:03.921 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:35:02 +0000 (0:00:00.076) 0:00:03.997 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:35:02 +0000 (0:00:00.023) 0:00:04.020 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:35:02 +0000 (0:00:00.022) 0:00:04.042 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:35:02 +0000 (0:00:00.190) 0:00:04.233 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:35:02 +0000 (0:00:00.019) 0:00:04.253 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:35:03 +0000 (0:00:01.082) 0:00:05.335 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:35:03 +0000 (0:00:00.047) 0:00:05.383 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:35:03 +0000 (0:00:00.044) 0:00:05.427 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:35:04 +0000 (0:00:00.666) 0:00:06.094 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:35:04 +0000 (0:00:00.077) 0:00:06.172 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:35:04 +0000 (0:00:00.021) 0:00:06.193 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:35:04 +0000 (0:00:00.020) 0:00:06.214 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:35:04 +0000 (0:00:00.020) 0:00:06.235 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:35:05 +0000 (0:00:00.797) 0:00:07.033 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:35:06 +0000 (0:00:01.839) 0:00:08.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:35:06 +0000 (0:00:00.044) 0:00:08.916 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:35:06 +0000 (0:00:00.027) 0:00:08.944 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.524) 0:00:09.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.029) 0:00:09.498 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.026) 0:00:09.525 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.039) 0:00:09.564 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.034) 0:00:09.599 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.034) 0:00:09.633 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.031) 0:00:09.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.029) 0:00:09.695 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.027) 0:00:09.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:35:07 +0000 (0:00:00.027) 0:00:09.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:35:08 +0000 (0:00:00.457) 0:00:10.207 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:35:08 +0000 (0:00:00.027) 0:00:10.235 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:17 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.819) 0:00:11.055 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:24 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.029) 0:00:11.084 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.041) 0:00:11.126 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.518) 0:00:11.645 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.035) 0:00:11.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.029) 0:00:11.711 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a cached LVM logical volume under volume group 'foo'] ************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:30 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.031) 0:00:11.743 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.053) 0:00:11.796 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:35:09 +0000 (0:00:00.042) 0:00:11.839 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set 
platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.734) 0:00:12.574 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.067) 0:00:12.641 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.064) 0:00:12.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.029) 0:00:12.735 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.059) 0:00:12.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.026) 0:00:12.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.027) 0:00:12.849 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "volumes": [ { "cache_devices": [ "sdb" ], "cache_size": "4g", "cached": true, "name": "test", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.033) 0:00:12.882 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:35:10 +0000 
(0:00:00.029) 0:00:12.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.026) 0:00:12.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:35:10 +0000 (0:00:00.028) 0:00:12.967 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:35:11 +0000 (0:00:00.030) 0:00:12.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:35:11 +0000 (0:00:00.029) 0:00:13.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:35:11 +0000 (0:00:00.041) 0:00:13.069 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 
16:35:11 +0000 (0:00:00.026) 0:00:13.095 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test" ], "mounts": [], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:35:14 +0000 (0:00:03.319) 0:00:16.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.032) 0:00:16.448 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.030) 0:00:16.479 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" } ], "changed": true, "crypts": [], 
"failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test" ], "mounts": [], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.039) 0:00:16.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.036) 0:00:16.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.032) 0:00:16.588 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.029) 0:00:16.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.031) 0:00:16.648 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.027) 0:00:16.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:35:14 +0000 (0:00:00.029) 0:00:16.705 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:35:15 +0000 (0:00:00.352) 0:00:17.058 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:35:15 +0000 (0:00:00.029) 0:00:17.087 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:44 Wednesday 01 June 2022 16:35:16 +0000 (0:00:00.921) 0:00:18.009 ******** included: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:35:16 +0000 (0:00:00.089) 0:00:18.098 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [ "sdb" ], "cache_mode": null, "cache_size": "4g", "cached": true, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:35:16 +0000 (0:00:00.039) 0:00:18.137 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:35:16 +0000 (0:00:00.029) 0:00:18.167 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test", "size": "5G", "type": "lvm", "uuid": "c9f35bb0-4bbc-4e73-b909-6c0452c338bb" }, "/dev/mapper/foo-test_cache_cpool_cdata": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_cache_cpool_cdata", "size": "4G", "type": "lvm", "uuid": "" }, "/dev/mapper/foo-test_cache_cpool_cmeta": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_cache_cpool_cmeta", "size": "8M", "type": "lvm", "uuid": "" }, "/dev/mapper/foo-test_corig": { "fstype": "", "label": "", "name": "/dev/mapper/foo-test_corig", "size": "5G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "3uTdDK-Vcm4-5C50-Fcf2-Vuk3-s4FU-c471iz" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "BrGSGe-pvMS-c43L-BDrn-7q0U-c8tc-DHHW2O" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, 
"/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:35:16 +0000 (0:00:00.470) 0:00:18.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002798", "end": "2022-06-01 12:35:16.572353", "rc": 0, "start": "2022-06-01 12:35:16.569555" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.470) 0:00:19.108 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002340", "end": "2022-06-01 12:35:16.926739", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:35:16.924399" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.351) 0:00:19.460 ******** 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.063) 0:00:19.523 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.032) 0:00:19.555 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.067) 0:00:19.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:35:17 +0000 (0:00:00.040) 0:00:19.663 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.847) 0:00:20.510 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.049) 0:00:20.560 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.038) 0:00:20.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.035) 0:00:20.634 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.035) 0:00:20.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.030) 0:00:20.699 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.049) 0:00:20.749 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.057) 0:00:20.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.031) 0:00:20.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.032) 0:00:20.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.029) 0:00:20.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.031) 0:00:20.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:35:18 +0000 (0:00:00.032) 0:00:20.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:20.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.028) 0:00:21.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.029) 0:00:21.053 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.056) 0:00:21.109 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.100) 0:00:21.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:21.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:21.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.029) 0:00:21.301 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.063) 0:00:21.364 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.033) 0:00:21.398 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.040) 0:00:21.439 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.072) 0:00:21.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.036) 0:00:21.548 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.036) 0:00:21.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.029) 0:00:21.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:21.645 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.032) 0:00:21.678 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.031) 0:00:21.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.035) 0:00:21.745 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.034) 0:00:21.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:21.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.030) 0:00:21.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.033) 0:00:21.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.032) 0:00:21.907 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:35:19 +0000 (0:00:00.033) 0:00:21.940 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.064) 0:00:22.005 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.062) 0:00:22.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.028) 0:00:22.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.028) 0:00:22.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.029) 0:00:22.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.030) 0:00:22.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.028) 0:00:22.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.031) 0:00:22.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.028) 0:00:22.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.031) 0:00:22.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.031) 0:00:22.335 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.064) 0:00:22.399 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.037) 0:00:22.437 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.145) 0:00:22.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.043) 0:00:22.626 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.042) 0:00:22.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.032) 0:00:22.700 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.036) 0:00:22.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 
16:35:20 +0000 (0:00:00.031) 0:00:22.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.030) 0:00:22.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.032) 0:00:22.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.030) 0:00:22.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.032) 0:00:22.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:35:20 +0000 (0:00:00.046) 0:00:22.942 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.036) 0:00:22.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.040) 0:00:23.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.078) 0:00:23.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.033) 0:00:23.131 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.037) 0:00:23.168 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.036) 0:00:23.205 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101313.8161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101313.8161216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4300, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101313.8161216, "nlink": 1, "path": "/dev/mapper/foo-test", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.397) 0:00:23.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.040) 0:00:23.643 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.038) 0:00:23.681 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get 
RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.037) 0:00:23.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.032) 0:00:23.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.035) 0:00:23.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.030) 0:00:23.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.031) 0:00:23.849 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.032) 0:00:23.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.043) 0:00:23.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:35:21 +0000 (0:00:00.031) 0:00:23.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.030) 0:00:23.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.030) 0:00:24.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.030) 0:00:24.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.030) 0:00:24.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.040) 0:00:24.118 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.035) 0:00:24.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.029) 0:00:24.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.028) 0:00:24.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.028) 0:00:24.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.032) 0:00:24.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.034) 0:00:24.308 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.031) 0:00:24.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.035) 0:00:24.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.035) 0:00:24.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.033) 0:00:24.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.032) 0:00:24.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:35:22 +0000 (0:00:00.041) 0:00:24.519 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.535) 0:00:25.054 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.374) 0:00:25.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.040) 0:00:25.468 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.035) 0:00:25.503 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.032) 0:00:25.536 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.037) 0:00:25.574 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.036) 0:00:25.610 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.036) 0:00:25.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.035) 0:00:25.682 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.039) 0:00:25.722 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.038) 0:00:25.761 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:35:23 +0000 (0:00:00.047) 0:00:25.808 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", 
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test" ], "delta": "0:00:00.035963", "end": "2022-06-01 12:35:23.855875", "rc": 0, "start": "2022-06-01 12:35:23.819912" } STDOUT: LVM2_LV_NAME=test LVM2_LV_ATTR=Cwi-a-C--- LVM2_CACHE_TOTAL_BLOCKS=65280 LVM2_CHUNK_SIZE=65536 LVM2_SEGTYPE=cache TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.591) 0:00:26.399 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "cache" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.041) 0:00:26.441 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.041) 0:00:26.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_cache_size": [ "65280", "65536" ] }, "changed": false } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.040) 0:00:26.523 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.387) 0:00:26.911 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_cache_size": "4294967296" }, "changed": false } TASK [Check 
cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:35:24 +0000 (0:00:00.040) 0:00:26.952 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.081) 0:00:27.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.032) 0:00:27.066 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.032) 0:00:27.098 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.028) 0:00:27.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove (detach) cache from the 'test' LV created above] ****************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:46 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.032) 0:00:27.159 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.072) 0:00:27.232 
******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.049) 0:00:27.281 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.529) 0:00:27.811 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.074) 0:00:27.885 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of 
volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.040) 0:00:27.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:35:25 +0000 (0:00:00.036) 0:00:27.962 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.066) 0:00:28.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.027) 0:00:28.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.032) 0:00:28.089 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "volumes": [ { "cached": false, "name": "test", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.038) 0:00:28.128 
******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.033) 0:00:28.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.032) 0:00:28.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.031) 0:00:28.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.031) 0:00:28.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.030) 0:00:28.287 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.045) 0:00:28.332 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:35:26 +0000 (0:00:00.030) 0:00:28.363 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test_cache", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/foo-test", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:35:31 +0000 (0:00:05.299) 0:00:33.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.032) 0:00:33.695 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.029) 0:00:33.724 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test_cache", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/foo-test", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.039) 0:00:33.763 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": 
"", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.039) 0:00:33.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.035) 0:00:33.838 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.032) 0:00:33.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:35:31 +0000 (0:00:00.074) 0:00:33.946 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:35:32 +0000 (0:00:00.031) 0:00:33.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:35:32 +0000 (0:00:00.032) 0:00:34.009 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:35:32 +0000 (0:00:00.366) 0:00:34.376 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:35:32 +0000 (0:00:00.035) 0:00:34.412 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:58 Wednesday 01 June 2022 16:35:33 +0000 (0:00:00.875) 0:00:35.287 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:35:33 +0000 (0:00:00.057) 0:00:35.345 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", 
"_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:35:33 +0000 (0:00:00.040) 0:00:35.385 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:35:33 +0000 (0:00:00.030) 0:00:35.415 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test", "size": "5G", "type": "lvm", "uuid": "c9f35bb0-4bbc-4e73-b909-6c0452c338bb" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "3uTdDK-Vcm4-5C50-Fcf2-Vuk3-s4FU-c471iz" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "BrGSGe-pvMS-c43L-BDrn-7q0U-c8tc-DHHW2O" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": 
"/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:35:33 +0000 (0:00:00.384) 0:00:35.800 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002798", "end": "2022-06-01 12:35:33.652519", "rc": 0, "start": "2022-06-01 12:35:33.649721" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:35:34 +0000 (0:00:00.389) 0:00:36.190 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002517", "end": "2022-06-01 12:35:34.021217", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:35:34.018700" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:35:34 +0000 (0:00:00.369) 0:00:36.559 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
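The two reads above follow a common verification pattern: `/etc/fstab` is read unconditionally, while the `/etc/crypttab` read is allowed to fail because the file may not exist (the log shows `rc: 1` with `"failed_when_result": false`). A minimal sketch of equivalent tasks, assuming register names and `changed_when` handling that are not visible in the log:

```yaml
# Hedged sketch, not the exact test source. Mirrors the two command tasks
# logged above from verify-role-results.yml; register names are assumptions.
- name: Read the /etc/fstab file for volume existence
  command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false

- name: Read the /etc/crypttab file
  command: cat /etc/crypttab
  register: storage_test_crypttab
  failed_when: false      # file may be absent; log shows rc=1 tolerated
  changed_when: false
```

With `failed_when: false` the task records the non-zero return code without aborting the play, which is why the log continues to the pool verification tasks.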
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022  16:35:34 +0000 (0:00:00.066)       0:00:36.626 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022  16:35:34 +0000 (0:00:00.032)       0:00:36.659 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022  16:35:34 +0000 (0:00:00.063)       0:00:36.723 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022  16:35:34 +0000 (0:00:00.039)       0:00:36.762 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.744)       0:00:37.507 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.064)       0:00:37.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.075)       0:00:37.647 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.036)       0:00:37.683 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.033)       0:00:37.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.027)       0:00:37.745 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }
MSG: All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.050)       0:00:37.796 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.071)       0:00:37.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.033)       0:00:37.901 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.029)       0:00:37.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022  16:35:35 +0000 (0:00:00.032)       0:00:37.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.029)       0:00:37.992 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.027)       0:00:38.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.027)       0:00:38.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.029)       0:00:38.078 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.029)       0:00:38.107 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.057)       0:00:38.164 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.062)       0:00:38.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.029)       0:00:38.256 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.028)       0:00:38.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.032)       0:00:38.317 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.062)       0:00:38.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.036)       0:00:38.416 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.042)       0:00:38.458 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.074)       0:00:38.533 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.039)       0:00:38.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.041)       0:00:38.614 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.031)       0:00:38.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.033)       0:00:38.679 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.030)       0:00:38.710 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.032)       0:00:38.742 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.036)       0:00:38.778 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.037)       0:00:38.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.030)       0:00:38.846 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.029)       0:00:38.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.028)       0:00:38.905 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.030)       0:00:38.935 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022  16:35:36 +0000 (0:00:00.030)       0:00:38.966 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.065)       0:00:39.031 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.063)       0:00:39.095 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.032)       0:00:39.128 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.159 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.033)       0:00:39.192 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.224 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.030)       0:00:39.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.032)       0:00:39.286 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.032)       0:00:39.318 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.350 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.034)       0:00:39.384 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.061)       0:00:39.446 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.036)       0:00:39.482 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.164)       0:00:39.647 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.035)       0:00:39.682 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.043)       0:00:39.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.757 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.035)       0:00:39.792 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.034)       0:00:39.859 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.036)       0:00:39.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.033)       0:00:39.929 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:35:37 +0000 (0:00:00.031)       0:00:39.960 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.047)       0:00:40.007 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.035)       0:00:40.043 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.034)       0:00:40.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.033)       0:00:40.110 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.031)       0:00:40.142 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.036)       0:00:40.179 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.036)       0:00:40.216 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101329.8031216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101329.8031216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4300, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101329.8031216, "nlink": 1, "path": "/dev/mapper/foo-test", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.370)       0:00:40.586 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.037)       0:00:40.624 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.036)       0:00:40.660 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.034)       0:00:40.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.032)       0:00:40.726 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.035)       0:00:40.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.029)       0:00:40.791 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.030)       0:00:40.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.029)       0:00:40.851 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.038)       0:00:40.889 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.033)       0:00:40.923 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:35:38 +0000 (0:00:00.030)       0:00:40.953 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:40.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.034)       0:00:41.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.033)       0:00:41.052 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.038)       0:00:41.091 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.036)       0:00:41.128 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.033)       0:00:41.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.193 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.031)       0:00:41.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.031)       0:00:41.317 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.347 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.031)       0:00:41.378 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.031)       0:00:41.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.030)       0:00:41.470 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:35:39 +0000 (0:00:00.386)       0:00:41.856 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.367)       0:00:42.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.040)       0:00:42.264 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.074)       0:00:42.338 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.031)       0:00:42.369 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.031)       0:00:42.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.030)       0:00:42.432 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.039)       0:00:42.471 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.042)       0:00:42.514 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.042)       0:00:42.556 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:35:40 +0000 (0:00:00.035)       0:00:42.592 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:35:40 +0000 (0:00:00.039) 0:00:42.632 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test" ], "delta": "0:00:00.035051", "end": "2022-06-01 12:35:40.497639", "rc": 0, "start": "2022-06-01 12:35:40.462588" } STDOUT: LVM2_LV_NAME=test LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.403) 0:00:43.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.038) 0:00:43.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.038) 0:00:43.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.030) 0:00:43.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 
June 2022 16:35:41 +0000 (0:00:00.032) 0:00:43.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.031) 0:00:43.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.031) 0:00:43.238 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.029) 0:00:43.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.030) 0:00:43.298 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.026) 0:00:43.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:60 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.031) 0:00:43.357 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.069) 0:00:43.427 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:35:41 +0000 (0:00:00.046) 0:00:43.473 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.523) 0:00:43.997 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.072) 0:00:44.070 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.030) 0:00:44.100 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.030) 0:00:44.131 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.061) 0:00:44.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.025) 0:00:44.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.030) 0:00:44.248 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "state": "absent", "volumes": [ { "name": "test", "size": "5g" } ] } ] } TASK 
[linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.037) 0:00:44.286 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.031) 0:00:44.317 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.029) 0:00:44.347 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.030) 0:00:44.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.029) 0:00:44.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.028) 0:00:44.436 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.049) 0:00:44.485 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:35:42 +0000 (0:00:00.034) 0:00:44.519 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:35:45 +0000 (0:00:02.708) 0:00:47.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.033) 0:00:47.262 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.029) 0:00:47.291 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test", "fs_type": null }, { "action": "destroy device", "device": 
"/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null 
} ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.041) 0:00:47.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.040) 0:00:47.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.035) 0:00:47.408 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.029) 0:00:47.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.031) 0:00:47.468 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.031) 0:00:47.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.028) 0:00:47.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:35:45 +0000 (0:00:00.367) 0:00:47.897 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:35:45 
+0000 (0:00:00.029) 0:00:47.926 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:72 Wednesday 01 June 2022 16:35:46 +0000 (0:00:00.811) 0:00:48.737 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:35:46 +0000 (0:00:00.064) 0:00:48.801 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test", "_mount_id": "/dev/mapper/foo-test", "_raw_device": "/dev/mapper/foo-test", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:35:46 +0000 (0:00:00.038) 0:00:48.840 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:35:46 +0000 (0:00:00.027) 0:00:48.868 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:35:47 +0000 (0:00:00.378) 0:00:49.246 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002643", "end": "2022-06-01 12:35:47.078728", "rc": 0, "start": "2022-06-01 12:35:47.076085" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:35:47 +0000 (0:00:00.369) 0:00:49.615 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003501", "end": "2022-06-01 12:35:47.469057", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:35:47.465556" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.395) 0:00:50.010 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.063) 0:00:50.074 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.105 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.060) 0:00:50.166 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.037) 0:00:50.203 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.234 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.263 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.036) 0:00:50.299 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.034) 0:00:50.334 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.034) 0:00:50.368 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.397 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.060) 0:00:50.458 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.055) 0:00:50.513 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.542 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.632 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.029) 0:00:50.661 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.692 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.722 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.753 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.059) 0:00:50.812 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.060) 0:00:50.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.904 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.030) 0:00:50.934 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:35:48 +0000 (0:00:00.031) 0:00:50.966 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.060) 0:00:51.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.034) 0:00:51.061 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.028) 0:00:51.089 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.150 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.059) 0:00:51.210 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.061) 0:00:51.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.029) 0:00:51.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.029) 0:00:51.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.029) 0:00:51.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.029) 0:00:51.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.031) 0:00:51.451 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.029) 0:00:51.481 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.542 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
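(Editor's note on the warning above: it is raised because an outer `include_tasks` loop already uses `storage_test_volume` as its loop variable while a nested task loops with the same name. A minimal sketch of the fix Ansible suggests, using illustrative task and variable names rather than the actual contents of the test files:)

```yaml
# Hypothetical excerpt: give the inner loop its own loop variable via
# loop_control so it no longer shadows the outer 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner  # distinct name avoids the collision
```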
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.057) 0:00:51.599 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.036) 0:00:51.636 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.124) 0:00:51.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.035) 0:00:51.796 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.048) 0:00:51.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.033) 0:00:51.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.036) 0:00:51.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:35:49 +0000 (0:00:00.030) 0:00:51.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.034) 0:00:51.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.031) 
0:00:52.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.033) 0:00:52.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.030) 0:00:52.075 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.046) 0:00:52.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.024) 0:00:52.147 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.036) 0:00:52.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.028) 0:00:52.212 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.030) 0:00:52.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.028) 0:00:52.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.025) 0:00:52.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.382) 0:00:52.679 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.037) 0:00:52.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.026) 0:00:52.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.034) 0:00:52.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.030) 0:00:52.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.025) 0:00:52.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.030) 0:00:52.863 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.030) 0:00:52.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.029) 0:00:52.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:35:50 +0000 (0:00:00.027) 0:00:52.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.032) 0:00:52.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.028) 0:00:53.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.101 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.038) 0:00:53.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.034) 0:00:53.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.028) 0:00:53.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.028) 0:00:53.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.028) 0:00:53.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.408 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.031) 0:00:53.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.033) 0:00:53.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.594 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.033) 0:00:53.627 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.657 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.034) 0:00:53.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.031) 0:00:53.723 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.029) 0:00:53.784 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.033) 0:00:53.817 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.033) 0:00:53.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.032) 0:00:53.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:35:51 +0000 (0:00:00.030) 0:00:53.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.032) 0:00:53.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.029) 0:00:54.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.029) 0:00:54.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.032) 0:00:54.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.030) 0:00:54.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.028) 0:00:54.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.032) 0:00:54.159 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.028) 0:00:54.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
*********************************************************************
/cache/rhel-x.qcow2 : ok=285 changed=3 unreachable=0 failed=0 skipped=245 rescued=0 ignored=0

Wednesday 01 June 2022 16:35:52 +0000 (0:00:00.016) 0:00:54.204 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 5.30s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.32s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_scsi_generated.yml:3
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Get the canonical device path for each member device -------------------- 0.85s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.79s
/tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml:2 ---------------
Get the canonical device path for each member device -------------------- 0.74s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : get required packages ---------------------- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Get information about the LV -------------------------------------------- 0.59s
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 -------------------------
parse the actual size of the volume ------------------------------------- 0.54s
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 --------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:35:53 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:35:54 +0000 (0:00:01.246) 0:00:01.269 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_lvm_pool_then_remove.yml ********************************
1 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:2
Wednesday 01 June 2022 16:35:54 +0000 (0:00:00.015) 0:00:01.285 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:15
Wednesday 01 June 2022 16:35:55 +0000 (0:00:01.072) 0:00:02.357 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:35:55 +0000 (0:00:00.038) 0:00:02.396 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:35:55 +0000 (0:00:00.156) 0:00:02.552 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.527) 0:00:03.079 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping:
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.075) 0:00:03.155 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.022) 0:00:03.177 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.022) 0:00:03.200 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.189) 0:00:03.390 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:35:56 +0000 (0:00:00.018) 0:00:03.409 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:35:57 +0000 (0:00:01.093) 0:00:04.502 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:35:57 +0000 (0:00:00.046) 0:00:04.549 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:35:57 +0000 (0:00:00.044) 0:00:04.593 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:35:58 +0000 (0:00:00.672) 0:00:05.266 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:35:58 +0000 (0:00:00.080) 0:00:05.346 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:35:58 +0000 (0:00:00.020) 0:00:05.367 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:35:58 +0000 (0:00:00.022) 0:00:05.389 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:35:58 +0000 (0:00:00.020) 0:00:05.409 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:35:59 +0000 (0:00:00.829) 0:00:06.239 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:36:01 +0000 (0:00:01.818) 0:00:08.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:36:01 +0000 
(0:00:00.043) 0:00:08.101 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.027) 0:00:08.129 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.538) 0:00:08.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.030) 0:00:08.698 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.027) 0:00:08.725 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.033) 0:00:08.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.032) 0:00:08.791 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.032) 0:00:08.823 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.059) 0:00:08.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.029) 0:00:08.912 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.027) 0:00:08.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:36:01 +0000 (0:00:00.029) 0:00:08.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:36:02 +0000 (0:00:00.447) 0:00:09.418 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:36:02 +0000 (0:00:00.027) 0:00:09.445 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:18 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.844) 0:00:10.290 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:25 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.028) 0:00:10.318 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.042) 0:00:10.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.454) 0:00:10.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.035) 0:00:10.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.029) 0:00:10.879 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create two LVM logical volumes under volume group 'foo'] ***************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:30 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.032) 0:00:10.912 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.053) 0:00:10.966 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:36:03 +0000 (0:00:00.041) 0:00:11.008 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.498) 0:00:11.506 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.066) 0:00:11.573 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.029) 0:00:11.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.632 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.061) 0:00:11.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.024) 0:00:11.718 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.746 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.034) 0:00:11.781 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.033) 0:00:11.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.029) 0:00:11.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.930 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.041) 0:00:11.971 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:36:04 +0000 (0:00:00.028) 0:00:11.999 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", 
"opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:36:07 +0000 (0:00:02.023) 0:00:14.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:36:07 +0000 (0:00:00.029) 0:00:14.052 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:36:07 +0000 (0:00:00.027) 0:00:14.080 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, 
"crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", 
"_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:36:07 +0000 (0:00:00.040) 0:00:14.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:36:07 +0000 (0:00:00.038) 0:00:14.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 
2022 16:36:07 +0000 (0:00:00.032) 0:00:14.192 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:36:07 +0000 (0:00:00.027) 0:00:14.219 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }
TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:36:08 +0000 (0:00:00.931) 0:00:15.151 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" }
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:36:09 +0000 (0:00:00.926) 0:00:16.077 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }
TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:36:09 +0000 (0:00:00.667) 0:00:16.744 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }
TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:36:10 +0000 (0:00:00.029) 0:00:17.105 ********
TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:36:10 +0000 (0:00:00.029) 0:00:17.135 ********
ok: [/cache/rhel-x.qcow2]
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:45
Wednesday 01 June 2022 16:36:10 +0000 (0:00:00.873) 0:00:18.008 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2
TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:36:11 +0000 (0:00:00.054) 0:00:18.063 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:36:11 +0000 (0:00:00.042) 0:00:18.105 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:36:11 +0000 (0:00:00.030) 0:00:18.135 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "c905981e-623c-4176-83ae-87ef17f23f74" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Ic8xpi-COVC-MgLK-Uyoy-4LeV-9u8a-4QoXKq" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:36:11 +0000 (0:00:00.518) 0:00:18.654 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003252", "end": "2022-06-01 12:36:11.594639", "rc": 0, "start": "2022-06-01 12:36:11.591387" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
/dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.519) 0:00:19.173 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003202", "end": "2022-06-01 12:36:11.968093", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:36:11.964891" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG:
non-zero return code
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.371) 0:00:19.545 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.076) 0:00:19.621 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.031) 0:00:19.653 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.103) 0:00:19.756 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:36:12 +0000 (0:00:00.039) 0:00:19.795 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.482) 0:00:20.278 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }
TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.041) 0:00:20.319 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.037) 0:00:20.357 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.035) 0:00:20.393 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.034) 0:00:20.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.029) 0:00:20.457 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed
TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.041) 0:00:20.499 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.058) 0:00:20.557 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.029) 0:00:20.587 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.029) 0:00:20.616 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.031) 0:00:20.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.032) 0:00:20.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.029) 0:00:20.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.032) 0:00:20.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.033) 0:00:20.775 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.033) 0:00:20.808 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.062) 0:00:20.870 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.073) 0:00:20.944 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:36:13 +0000 (0:00:00.033) 0:00:20.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.033) 0:00:21.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.030) 0:00:21.041 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.070 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.129 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.062) 0:00:21.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.036) 0:00:21.229 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason":
"Conditional result was False" }
TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.034) 0:00:21.263 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.057) 0:00:21.320 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.036) 0:00:21.357 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.035) 0:00:21.392 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.028) 0:00:21.421 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.028) 0:00:21.450 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.480 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.030) 0:00:21.510 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.031) 0:00:21.542 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2
TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.060) 0:00:21.602 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.077) 0:00:21.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.031) 0:00:21.712 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.031) 0:00:21.803 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.072) 0:00:21.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.030) 0:00:21.907 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.030) 0:00:21.937 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.029) 0:00:21.967 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:36:14 +0000 (0:00:00.028) 0:00:21.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.028) 0:00:22.023 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.029) 0:00:22.053 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.027) 0:00:22.081 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.030) 0:00:22.111 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.027) 0:00:22.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.028) 0:00:22.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.031) 0:00:22.199 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.035) 0:00:22.235 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.075) 0:00:22.310 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.035) 0:00:22.345
********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.127) 0:00:22.473 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.035) 0:00:22.508 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.046) 0:00:22.554 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.052) 0:00:22.607 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.036) 0:00:22.643 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.038) 0:00:22.682 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.034) 0:00:22.716 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.029) 0:00:22.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.031) 0:00:22.778 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.032) 0:00:22.810 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.049) 0:00:22.860 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.037) 0:00:22.897 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.037) 0:00:22.935 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.031) 0:00:22.966 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:36:15 +0000 (0:00:00.031) 0:00:22.998 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.036) 0:00:23.034 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.036) 0:00:23.071 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101366.3771214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101366.3771214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4751, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101366.3771214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.367) 0:00:23.438 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.038) 0:00:23.476 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.038) 0:00:23.514 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.035) 0:00:23.550 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.031) 0:00:23.581 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.036) 0:00:23.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.035) 0:00:23.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.030) 0:00:23.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.032) 0:00:23.716 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.037) 0:00:23.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.030) 0:00:23.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.032) 0:00:23.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.030) 0:00:23.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.030) 0:00:23.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.032) 0:00:23.910 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.037) 0:00:23.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:36:16 +0000 (0:00:00.035) 0:00:23.983 ******** skipping: [/cache/rhel-x.qcow2] => { 
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.030)       0:00:24.014 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.029)       0:00:24.044 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.029)       0:00:24.073 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.080)       0:00:24.154 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.031)       0:00:24.185 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.030)       0:00:24.216 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.029)       0:00:24.245 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.030)       0:00:24.275 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.030)       0:00:24.306 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.031)       0:00:24.337 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.029)       0:00:24.367 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:36:17 +0000 (0:00:00.476)       0:00:24.844 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.388)       0:00:25.232 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "5368709120"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.038)       0:00:25.271 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.034)       0:00:25.305 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.031)       0:00:25.336 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.031)       0:00:25.368 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.028)       0:00:25.396 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.030)       0:00:25.427 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.032)       0:00:25.459 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 5368709120,
        "changed": false,
        "failed": false,
        "lvm": "5g",
        "parted": "5GiB",
        "size": "5 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.034)       0:00:25.493 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.031)       0:00:25.525 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.042)       0:00:25.568 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.032998",
    "end": "2022-06-01 12:36:18.408628",
    "rc": 0,
    "start": "2022-06-01 12:36:18.375630"
}

STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  16:36:18 +0000 (0:00:00.419)       0:00:25.987 ********
ok:
[/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.038)       0:00:26.025 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.039)       0:00:26.065 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.032)       0:00:26.097 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.030)       0:00:26.128 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.031)       0:00:26.160 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.031)       0:00:26.192 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.030)       0:00:26.222 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.035)       0:00:26.257 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.117)       0:00:26.374 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test2"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.033)       0:00:26.407 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 1030454,
                "block_size": 4096,
                "block_total": 1046016,
                "block_used": 15562,
                "device": "/dev/mapper/foo-test2",
                "fstype": "xfs",
                "inode_available": 2097149,
                "inode_total": 2097152,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 4220739584,
                "size_total": 4284481536,
                "uuid": "c905981e-623c-4176-83ae-87ef17f23f74"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 1030454,
                "block_size": 4096,
                "block_total": 1046016,
                "block_used": 15562,
                "device": "/dev/mapper/foo-test2",
                "fstype": "xfs",
                "inode_available": 2097149,
                "inode_total": 2097152,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 4220739584,
                "size_total": 4284481536,
                "uuid": "c905981e-623c-4176-83ae-87ef17f23f74"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.041)       0:00:26.449 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.038)       0:00:26.488 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.034)       0:00:26.523 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.036)       0:00:26.560 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.030)       0:00:26.590 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.031)       0:00:26.621 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.030)       0:00:26.651 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.036)       0:00:26.688 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test2 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.056)       0:00:26.744 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.036)       0:00:26.780 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.037)       0:00:26.818 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.030)       0:00:26.848 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.034)       0:00:26.883 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:36:19 +0000 (0:00:00.088)       0:00:26.971 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.038)       0:00:27.009 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654101366.1401215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654101366.1401215,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 4717,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654101366.1401215,
        "nlink": 1,
        "path": "/dev/mapper/foo-test2",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.373)       0:00:27.382 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.038)       0:00:27.421 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.036)       0:00:27.457 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.034)       0:00:27.492 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.030)       0:00:27.523 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.038)       0:00:27.561 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.032)       0:00:27.594 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.031)       0:00:27.625 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.031)       0:00:27.656 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.037)       0:00:27.694 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.031)       0:00:27.726 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.034)       0:00:27.760 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.031)       0:00:27.792 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.031)       0:00:27.824 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:36:20 +0000 (0:00:00.030)       0:00:27.854 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:36:20 +0000 (0:00:00.038) 0:00:27.892 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:36:20 +0000 (0:00:00.035) 0:00:27.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:36:20 +0000 (0:00:00.031) 0:00:27.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:36:20 +0000 (0:00:00.028) 0:00:27.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.029) 0:00:28.018 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.029) 0:00:28.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.028) 0:00:28.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.028) 0:00:28.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.030) 0:00:28.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.028) 0:00:28.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.028) 0:00:28.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
16:36:21 +0000 (0:00:00.028) 0:00:28.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.031) 0:00:28.252 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.365) 0:00:28.617 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:36:21 +0000 (0:00:00.372) 0:00:28.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.037) 0:00:29.028 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.034) 0:00:29.062 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.034) 0:00:29.097 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
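[Editor's note] The "parse the actual/requested size" tasks above normalize size strings such as `4g` (lvm) and `4GiB` (parted) to the byte count `4294967296`. A minimal sketch of that conversion, assuming binary (1024-based) units as the task output implies; `size_to_bytes` is a hypothetical helper, not part of the role:

```python
import re

# Binary (IEC) unit multipliers, matching the 4 GiB == 4294967296 in the log.
_UNITS = {"": 1, "k": 1024, "m": 1024**2, "g": 1024**3, "t": 1024**4}

def size_to_bytes(size: str) -> int:
    """Convert strings like '4g', '4GiB', or '4 GiB' into bytes."""
    m = re.fullmatch(r"\s*([\d.]+)\s*([kmgt]?)(i?b)?\s*", size.lower())
    if not m:
        raise ValueError(f"unparseable size: {size!r}")
    return int(float(m.group(1)) * _UNITS[m.group(2)])
```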
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.031) 0:00:29.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.031) 0:00:29.160 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.029) 0:00:29.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.030) 0:00:29.220 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.034) 0:00:29.255 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.035) 0:00:29.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.039) 0:00:29.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.031705", "end": "2022-06-01 12:36:22.158870", "rc": 0, "start": "2022-06-01 12:36:22.127165" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.408) 0:00:29.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.039) 0:00:29.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.038) 0:00:29.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.032) 0:00:29.849 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 
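[Editor's note] The `lvs --noheadings --nameprefixes --unquoted` call above emits one line of `LVM2_KEY=value` pairs. A sketch of parsing that STDOUT into a dict (the field names and values below are copied from the log; the parser itself is illustrative, not part of the role, and assumes unquoted values contain no spaces):

```python
def parse_lvs_nameprefixes(line: str) -> dict:
    """Split `KEY=value` tokens from lvs --nameprefixes output into a dict,
    dropping the LVM2_ prefix and lowercasing keys."""
    pairs = {}
    for token in line.split():
        key, _, value = token.partition("=")
        pairs[key.removeprefix("LVM2_").lower()] = value
    return pairs

# STDOUT copied from the "Get information about the LV" task above.
info = parse_lvs_nameprefixes(
    "LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- "
    "LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear"
)
```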
01 June 2022 16:36:22 +0000 (0:00:00.035) 0:00:29.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.038) 0:00:29.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.032) 0:00:29.956 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:36:22 +0000 (0:00:00.032) 0:00:29.988 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.034) 0:00:30.022 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.030) 0:00:30.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove one of the LVM logical volumes in 'foo' created above] ************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:47 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.032) 0:00:30.085 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.066) 0:00:30.151 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.044) 0:00:30.196 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.510) 0:00:30.706 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.071) 0:00:30.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.031) 0:00:30.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.030) 0:00:30.841 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.062) 0:00:30.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.026) 0:00:30.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.031) 0:00:30.962 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", 
"size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:36:23 +0000 (0:00:00.039) 0:00:31.001 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.034) 0:00:31.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.033) 0:00:31.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.030) 0:00:31.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.030) 0:00:31.130 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.030) 0:00:31.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.044) 0:00:31.205 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:36:24 +0000 (0:00:00.028) 0:00:31.234 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:36:26 +0000 (0:00:01.905) 0:00:33.140 ******** skipping: [/cache/rhel-x.qcow2] 
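[Editor's note] The `mounts` list reported by the "manage the pools and volumes" task above drives the two follow-up tasks, "remove obsolete mounts" and "set up new/current mounts". A sketch of that split, using the exact entries from the log (the filtering is illustrative; the role itself does this via task conditionals):

```python
# `mounts` result copied from the blivet task output above: test2 is being
# removed (state: absent), test1 stays mounted (state: mounted).
MOUNTS = [
    {"fstype": "xfs", "path": "/opt/test2",
     "src": "/dev/mapper/foo-test2", "state": "absent"},
    {"dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0,
     "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted"},
]

to_remove = [m for m in MOUNTS if m["state"] == "absent"]
to_mount = [m for m in MOUNTS if m["state"] == "mounted"]
```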
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.034) 0:00:33.175 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.029) 0:00:33.204 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.043) 0:00:33.248 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.042) 0:00:33.291 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.036) 0:00:33.327 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:36:26 +0000 (0:00:00.387) 0:00:33.715 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:36:27 +0000 (0:00:00.660) 0:00:34.375 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": 
"/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:36:27 +0000 (0:00:00.375) 0:00:34.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:36:28 +0000 (0:00:00.665) 0:00:35.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:36:28 +0000 (0:00:00.367) 0:00:35.784 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:36:28 +0000 (0:00:00.029) 0:00:35.814 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:64 Wednesday 01 June 2022 16:36:29 +0000 (0:00:00.912) 0:00:36.726 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:36:29 +0000 (0:00:00.094) 0:00:36.820 ******** ok: 
[/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:36:29 +0000 (0:00:00.042) 0:00:36.863 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:36:29 +0000 (0:00:00.030) 0:00:36.893 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Ic8xpi-COVC-MgLK-Uyoy-4LeV-9u8a-4QoXKq" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", 
"label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:36:30 +0000 (0:00:00.373) 0:00:37.267 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002726", "end": "2022-06-01 12:36:30.054799", "rc": 0, "start": "2022-06-01 12:36:30.052073" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:36:30 +0000 (0:00:00.365) 0:00:37.633 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002678", "end": "2022-06-01 12:36:30.416524", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:36:30.413846" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:36:30 +0000 (0:00:00.367) 0:00:38.000 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
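[Editor's note] The verification step above cats `/etc/fstab` and checks that the removed volume's entry is gone while `foo-test1` remains mounted at `/opt/test1`. A sketch of that check in plain Python, using the fstab text from the log's STDOUT (`fstab_mounts` is a hypothetical helper, not what the test playbook runs):

```python
# /etc/fstab contents copied from the "Read the /etc/fstab file" task above.
FSTAB = """\
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
"""

def fstab_mounts(text: str) -> dict:
    """Map each fstab source device to its mount point, skipping comments."""
    entries = {}
    for line in text.splitlines():
        fields = line.split()
        if len(fields) >= 2 and not fields[0].startswith("#"):
            entries[fields[0]] = fields[1]
    return entries

mounts = fstab_mounts(FSTAB)
```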
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.069) 0:00:38.069 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.031) 0:00:38.101 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.060) 0:00:38.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.038) 0:00:38.200 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.382) 0:00:38.583 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.040) 0:00:38.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.041) 0:00:38.666 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.038) 0:00:38.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.035) 0:00:38.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.033) 0:00:38.773 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.043) 
0:00:38.817 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.059) 0:00:38.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.031) 0:00:38.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.030) 0:00:38.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.032) 0:00:38.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:36:31 +0000 (0:00:00.033) 0:00:39.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.031) 0:00:39.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.030) 0:00:39.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.030) 0:00:39.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.126 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.058) 0:00:39.185 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.069) 0:00:39.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.031) 0:00:39.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.028) 0:00:39.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.431 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.096) 0:00:39.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.042) 0:00:39.570 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.035) 0:00:39.606 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.057) 0:00:39.663 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.038) 0:00:39.702 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.035) 0:00:39.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 
0:00:39.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.029) 0:00:39.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.030) 0:00:39.858 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.038) 0:00:39.897 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:36:32 +0000 (0:00:00.064) 0:00:39.961 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.080) 0:00:40.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.032) 0:00:40.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.030) 0:00:40.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.030) 0:00:40.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.030) 0:00:40.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.032) 0:00:40.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.031) 0:00:40.230 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.029) 0:00:40.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.031) 0:00:40.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.031) 0:00:40.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.030) 0:00:40.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.032) 0:00:40.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.029) 0:00:40.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.029) 0:00:40.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.029) 0:00:40.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.029) 0:00:40.504 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.030) 0:00:40.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.033) 0:00:40.567 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
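
[Editor's note] The `[WARNING]` above is Ansible's own recommendation: rename the loop variable via `loop_control` so the inner include does not collide with `storage_test_volume`, which an outer loop already defines. A minimal sketch of that fix follows; the task name, file name, and the replacement variable name `storage_test_volume_inner` are illustrative assumptions, not taken from the actual test source:

```yaml
# Hedged sketch of the loop_var fix the warning suggests.
# Without loop_control, the nested include would reuse the default
# (or already-bound) loop variable and silently shadow the outer value.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner   # hypothetical non-colliding name
```

Inside `test-verify-volume.yml`, references would then use `storage_test_volume_inner` instead of the shadowed `storage_test_volume`, which removes both warnings emitted in this run.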
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.069) 0:00:40.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.035) 0:00:40.672 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.122) 0:00:40.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
16:36:33 +0000 (0:00:00.038) 0:00:40.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.042) 0:00:40.876 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.036) 0:00:40.913 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.034) 0:00:40.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:36:33 +0000 (0:00:00.037) 0:00:40.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.030) 0:00:41.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.033) 0:00:41.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.028) 0:00:41.078 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.028) 0:00:41.107 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.043) 0:00:41.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.035) 0:00:41.186 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.036) 0:00:41.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.030) 0:00:41.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.030) 0:00:41.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.041) 0:00:41.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.043) 0:00:41.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101366.3771214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101366.3771214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4751, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101366.3771214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.378) 0:00:41.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:36:34 +0000 (0:00:00.038) 0:00:41.786 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.037)       0:00:41.823 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.034)       0:00:41.858 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.030)       0:00:41.888 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.038)       0:00:41.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.032)       0:00:41.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:36:34 +0000 (0:00:00.031)       0:00:41.990 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.021 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.039)       0:00:42.060 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.092 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.033)       0:00:42.125 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.030)       0:00:42.156 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.030)       0:00:42.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.219 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.043)       0:00:42.262 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.038)       0:00:42.301 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.033)       0:00:42.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.030)       0:00:42.365 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.030)       0:00:42.395 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.032)       0:00:42.460 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.030)       0:00:42.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.033)       0:00:42.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.556 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.032)       0:00:42.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.033)       0:00:42.622 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:36:35 +0000 (0:00:00.031)       0:00:42.654 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.374)       0:00:43.029 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.360)       0:00:43.389 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.038)       0:00:43.428 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.033)       0:00:43.461 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.032)       0:00:43.494 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.030)       0:00:43.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.032)       0:00:43.557 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.032)       0:00:43.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.029)       0:00:43.619 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.037)       0:00:43.656 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.037)       0:00:43.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  16:36:36 +0000 (0:00:00.040)       0:00:43.734 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.034982", "end": "2022-06-01 12:36:36.564467", "rc": 0, "start": "2022-06-01 12:36:36.529485" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.416)       0:00:44.151 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.038)       0:00:44.189 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.038)       0:00:44.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.032)       0:00:44.261 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.032)       0:00:44.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.034)       0:00:44.327 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.032)       0:00:44.359 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.031)       0:00:44.391 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.035)       0:00:44.426 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.122)       0:00:44.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.036)       0:00:44.585 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.088)       0:00:44.674 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.030)       0:00:44.705 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.035)       0:00:44.741 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.030)       0:00:44.771 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.030)       0:00:44.801 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.030)       0:00:44.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.033)       0:00:44.866 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.036)       0:00:44.902 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.050)       0:00:44.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:36:37 +0000 (0:00:00.026)       0:00:44.979 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.037)       0:00:45.016 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.031)       0:00:45.048 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.032)       0:00:45.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.028)       0:00:45.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.024)       0:00:45.133 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.366)       0:00:45.500 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.038)       0:00:45.538 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.026)       0:00:45.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.033)       0:00:45.599 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.032)       0:00:45.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.026)       0:00:45.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.031)       0:00:45.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.031)       0:00:45.721 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.030)       0:00:45.751 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.027)       0:00:45.778 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.033)       0:00:45.812 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.030)       0:00:45.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.029)       0:00:45.872 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.030)       0:00:45.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.031)       0:00:45.935 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:36:38 +0000 (0:00:00.036)       0:00:45.971 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.039)       0:00:46.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.043 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.032)       0:00:46.075 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.106 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.168 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.033)       0:00:46.202 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.233 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.264 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.029)       0:00:46.294 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.387 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.418 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.448 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.033)       0:00:46.481 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.511 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.542 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.574 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.604 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.635 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.032)       0:00:46.668 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.033)       0:00:46.701 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.032)       0:00:46.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.794 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.030)       0:00:46.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.857 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.888 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.031)       0:00:46.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  16:36:39 +0000 (0:00:00.033)       0:00:46.953 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.070)       0:00:47.024 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.032)       0:00:47.056 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.028)       0:00:47.085 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:66
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.030)       0:00:47.116 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.072)       0:00:47.189 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.043)       0:00:47.233 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.514)       0:00:47.748 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.070)       0:00:47.819 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.031)       0:00:47.850 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:36:40 +0000 (0:00:00.030)       0:00:47.880 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:36:40 +0000 (0:00:00.062) 0:00:47.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:36:40 +0000 (0:00:00.025) 0:00:47.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:36:40 +0000 (0:00:00.029) 0:00:47.998 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.041) 0:00:48.039 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.031) 0:00:48.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.029) 0:00:48.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.032) 0:00:48.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.030) 0:00:48.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.029) 0:00:48.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.043) 0:00:48.236 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:36:41 +0000 (0:00:00.026) 0:00:48.263 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", 
"/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:36:42 +0000 (0:00:01.338) 0:00:49.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.032) 0:00:49.634 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.031) 0:00:49.665 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test 
verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.042) 0:00:49.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.038) 0:00:49.746 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.034) 0:00:49.780 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:36:42 +0000 (0:00:00.030) 0:00:49.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:36:43 +0000 (0:00:00.669) 0:00:50.480 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:36:43 +0000 (0:00:00.388) 0:00:50.869 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:36:44 +0000 (0:00:00.715) 0:00:51.585 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:36:44 +0000 (0:00:00.363) 0:00:51.948 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:36:44 +0000 (0:00:00.031) 0:00:51.979 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:82 Wednesday 01 June 2022 16:36:45 +0000 (0:00:00.839) 0:00:52.818 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:36:45 +0000 (0:00:00.068) 0:00:52.887 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", 
"type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:36:45 +0000 (0:00:00.051) 0:00:52.938 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:36:45 +0000 (0:00:00.031) 0:00:52.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Ic8xpi-COVC-MgLK-Uyoy-4LeV-9u8a-4QoXKq" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": 
"disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:36:46 +0000 (0:00:00.369) 0:00:53.340 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.005265", "end": "2022-06-01 12:36:46.138812", "rc": 0, "start": "2022-06-01 12:36:46.133547" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:36:46 +0000 (0:00:00.378) 0:00:53.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002668", "end": "2022-06-01 12:36:46.516205", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:36:46.513537" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.374) 0:00:54.093 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.069) 0:00:54.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.030) 0:00:54.193 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.059) 0:00:54.252 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.037) 0:00:54.290 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.390) 0:00:54.680 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.042) 0:00:54.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.038) 0:00:54.761 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.036) 0:00:54.798 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.035) 0:00:54.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.031) 0:00:54.865 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.051) 0:00:54.916 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
**********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:36:47 +0000 (0:00:00.057) 0:00:54.974 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.078) 0:00:55.053 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.032) 0:00:55.085 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.031) 0:00:55.116 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.147 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.178 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.209 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.032) 0:00:55.241 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.272 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.057) 0:00:55.329 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.071) 0:00:55.401 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.029) 0:00:55.431 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.461 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.492 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.031) 0:00:55.524 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.033) 0:00:55.557 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.030) 0:00:55.587 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.060) 0:00:55.648 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.036) 0:00:55.685 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.036) 0:00:55.721 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.058) 0:00:55.779 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.035) 0:00:55.815 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.035) 0:00:55.850 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.031) 0:00:55.881 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.031) 0:00:55.913 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.028) 0:00:55.942 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.032) 0:00:55.975 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:36:48 +0000 (0:00:00.031) 0:00:56.006 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.064) 0:00:56.071 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.077) 0:00:56.148 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.031) 0:00:56.180 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.031) 0:00:56.211 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.030) 0:00:56.241 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.270 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.030) 0:00:56.301 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.330 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.359 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.032) 0:00:56.392 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.030) 0:00:56.422 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.452 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.030) 0:00:56.483 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.512 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.542 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.031) 0:00:56.573 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.603 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.030) 0:00:56.633 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.029) 0:00:56.663 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.068) 0:00:56.732 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.033) 0:00:56.766 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.181) 0:00:56.948 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:36:49 +0000 (0:00:00.041) 0:00:56.989 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 1290774,
                "block_size": 4096,
                "block_total": 1308160,
                "block_used": 17386,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 2621437,
                "inode_total": 2621440,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 5287010304,
                "size_total": 5358223360,
                "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 1290774,
                "block_size": 4096,
                "block_total": 1308160,
                "block_used": 17386,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 2621437,
                "inode_total": 2621440,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 5287010304,
                "size_total": 5358223360,
                "uuid": "fb29c34c-de85-4c51-84c9-76b6ffe7090a"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.045) 0:00:57.034 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.038) 0:00:57.072 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.037) 0:00:57.110 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.037) 0:00:57.147 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.029) 0:00:57.177 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.029) 0:00:57.207 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.030) 0:00:57.237 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.030) 0:00:57.267 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.048) 0:00:57.316 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.036) 0:00:57.352 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.040) 0:00:57.392 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.030) 0:00:57.422 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.033) 0:00:57.456 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.039) 0:00:57.495 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.040) 0:00:57.536 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654101366.3771214,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654101366.3771214,
        "dev": 5,
        "device_type": 64769,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 4751,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654101366.3771214,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.383) 0:00:57.920 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.038) 0:00:57.958 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:36:50 +0000 (0:00:00.036) 0:00:57.995 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.035) 0:00:58.030 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.030) 0:00:58.061 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.035) 0:00:58.096 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.031) 0:00:58.128 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.158 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.187 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.035) 0:00:58.222 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.252 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.282 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.031) 0:00:58.314 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.030) 0:00:58.344 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.374 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.036) 0:00:58.410 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.034) 0:00:58.444 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.474 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.032) 0:00:58.506 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.029) 0:00:58.535 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.030) 0:00:58.566 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.030) 0:00:58.597 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.031) 0:00:58.628 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.031) 0:00:58.660 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.036) 0:00:58.696 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.032) 0:00:58.728 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.032) 0:00:58.761 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:36:51 +0000 (0:00:00.032) 0:00:58.793 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.385) 0:00:59.179 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.377) 0:00:59.556 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "5368709120"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.041) 0:00:59.598 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.037) 0:00:59.635 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.033) 0:00:59.669 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.034) 0:00:59.703 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.073) 0:00:59.776 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.030) 0:00:59.807 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.032) 0:00:59.840 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 5368709120,
        "changed": false,
        "failed": false,
        "lvm": "5g",
        "parted": "5GiB",
        "size": "5 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.037) 0:00:59.877 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.040) 0:00:59.917 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:36:52 +0000 (0:00:00.039) 0:00:59.957 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.035486",
    "end": "2022-06-01 12:36:52.796600",
    "rc": 0,
    "start": "2022-06-01 12:36:52.761114"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.416) 0:01:00.374 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.039) 0:01:00.414 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.038) 0:01:00.453 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.032) 0:01:00.485 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.032) 0:01:00.517 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.031) 0:01:00.549 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.034) 0:01:00.583 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.030) 0:01:00.614 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.035) 0:01:00.649 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.122) 0:01:00.771 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.035) 0:01:00.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.041) 0:01:00.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.032) 0:01:00.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.035) 0:01:00.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.034) 0:01:00.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:36:53 +0000 (0:00:00.031) 0:01:00.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.031) 0:01:01.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.029) 0:01:01.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.031) 0:01:01.075 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], 
"storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.043) 0:01:01.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.023) 0:01:01.142 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.032) 0:01:01.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.030) 0:01:01.206 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.032) 0:01:01.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.042) 0:01:01.281 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.031) 0:01:01.312 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.379) 0:01:01.692 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.037) 0:01:01.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.027) 0:01:01.757 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.035) 0:01:01.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.031) 0:01:01.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.027) 0:01:01.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.033) 0:01:01.885 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.031) 0:01:01.916 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.029) 0:01:01.945 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.026) 0:01:01.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:36:54 +0000 (0:00:00.030) 0:01:02.003 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.033) 0:01:02.036 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.034) 0:01:02.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.029) 0:01:02.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.029) 0:01:02.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.040) 0:01:02.171 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.040) 0:01:02.212 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.243 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.093) 0:01:02.336 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.032) 0:01:02.369 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.032) 0:01:02.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.030) 0:01:02.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.032) 0:01:02.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.034) 0:01:02.530 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.625 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.032) 0:01:02.657 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.034) 0:01:02.723 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.033) 0:01:02.757 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.788 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.041) 0:01:02.861 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.031) 0:01:02.892 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.034) 0:01:02.927 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.036) 0:01:02.963 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:36:55 +0000 (0:00:00.032) 0:01:02.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.029) 0:01:03.025 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.031) 0:01:03.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.028) 0:01:03.086 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.030) 0:01:03.116 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.027) 0:01:03.144 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.031) 0:01:03.176 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.029) 0:01:03.205 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.029) 0:01:03.234 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.030) 0:01:03.264 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.025) 0:01:03.322 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.032) 0:01:03.297 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove both of the LVM logical volumes in 'foo' created above] ***********
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:84
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.027) 0:01:03.350 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.082) 0:01:03.432 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:36:56 +0000 (0:00:00.049) 0:01:03.481 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.531) 0:01:04.013 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.076) 0:01:04.090 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.030) 0:01:04.120 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.029) 0:01:04.149 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.060) 0:01:04.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.025) 0:01:04.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.028) 0:01:04.264 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.035) 0:01:04.299 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.033) 0:01:04.333 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.032) 0:01:04.365 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.032) 0:01:04.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.034) 0:01:04.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.031) 0:01:04.464 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.046) 0:01:04.510 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:36:57 +0000 (0:00:00.031) 0:01:04.541 ********
changed: [/cache/rhel-x.qcow2] => {
"actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ],
"changed": true,
"crypts": [],
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ],
"mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ],
"packages": [ "xfsprogs", "dosfstools" ],
"pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm",
"volumes": [
{ "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null },
{ "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }
] } ],
"volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:36:59 +0000 (0:00:01.897) 0:01:06.439 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.031) 0:01:06.470 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.029) 0:01:06.500 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": {
"actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ],
"changed": true,
"crypts": [],
"failed": false,
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda",
"/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 
0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.042) 0:01:06.543 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.039) 0:01:06.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.033) 0:01:06.616 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:36:59 +0000 (0:00:00.380) 0:01:06.997 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:37:00 +0000 (0:00:00.659) 0:01:07.657 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:37:00 +0000 (0:00:00.032) 0:01:07.689 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:37:01 +0000 (0:00:00.620) 0:01:08.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:37:01 +0000 (0:00:00.355) 0:01:08.666 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:37:01 +0000 (0:00:00.030) 0:01:08.696 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:100 Wednesday 01 June 2022 16:37:02 +0000 (0:00:00.832) 0:01:09.528 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 
2022 16:37:02 +0000 (0:00:00.073) 0:01:09.601 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:37:02 +0000 (0:00:00.047) 0:01:09.649 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:37:02 +0000 (0:00:00.032) 0:01:09.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": 
"10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.391) 0:01:10.073 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002549", "end": "2022-06-01 12:37:02.864901", "rc": 0, "start": "2022-06-01 12:37:02.862352" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.419) 0:01:10.493 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002510", "end": "2022-06-01 12:37:03.274231", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:37:03.271721" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.355) 0:01:10.848 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.071) 0:01:10.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.029) 0:01:10.948 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:37:03 +0000 (0:00:00.059) 0:01:11.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.041) 0:01:11.049 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.027) 0:01:11.076 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.027) 0:01:11.104 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.038) 0:01:11.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.035) 0:01:11.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.034) 0:01:11.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.030) 0:01:11.243 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.026) 0:01:11.270 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.054) 0:01:11.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.030) 
0:01:11.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.030) 0:01:11.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.027) 0:01:11.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.027) 0:01:11.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.028) 0:01:11.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.030) 0:01:11.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.029) 0:01:11.529 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.031) 0:01:11.560 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.055) 0:01:11.616 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.068) 0:01:11.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.031) 0:01:11.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.029) 0:01:11.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.030) 0:01:11.775 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.029) 0:01:11.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.031) 0:01:11.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.029) 0:01:11.866 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.060) 0:01:11.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.034) 0:01:11.961 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:37:04 +0000 (0:00:00.026) 0:01:11.988 ******** TASK [set_fact] **************************************************************** 
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.026) 0:01:12.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.033) 0:01:12.048 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.061) 0:01:12.109 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.090) 0:01:12.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.033) 0:01:12.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.035) 0:01:12.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.034) 0:01:12.304 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.031) 0:01:12.335 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.028) 0:01:12.364 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.033) 0:01:12.397 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.030) 0:01:12.428 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.031) 0:01:12.459 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.028) 0:01:12.488 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.028) 0:01:12.517 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.030) 0:01:12.548 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.031) 0:01:12.579 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.074) 0:01:12.654 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.032) 0:01:12.686 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.029) 0:01:12.716 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.030) 0:01:12.746 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.029) 0:01:12.776 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.068) 0:01:12.844 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.036) 0:01:12.880 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:37:05 +0000 (0:00:00.122) 0:01:13.003 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.037) 0:01:13.040 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.039) 0:01:13.080 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.029) 0:01:13.109 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.035) 0:01:13.144 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.028) 0:01:13.173 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.028) 0:01:13.202 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.029) 0:01:13.232 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.028) 0:01:13.260 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.030) 0:01:13.291 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.048) 0:01:13.339 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.026) 0:01:13.366 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.035) 0:01:13.401 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.029) 0:01:13.430 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.030) 0:01:13.461 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.028) 0:01:13.489 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.026) 0:01:13.516 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.374) 0:01:13.891 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.036) 0:01:13.928 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.024) 0:01:13.953 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:37:06 +0000 (0:00:00.031) 0:01:13.985 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.028) 0:01:14.013 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.024) 0:01:14.038 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.031) 0:01:14.070 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.028) 0:01:14.098 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.127 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.024) 0:01:14.152 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.028) 0:01:14.181 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.028) 0:01:14.210 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.031) 0:01:14.241 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.271 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.301 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.035) 0:01:14.336 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.032) 0:01:14.369 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.398 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.030) 0:01:14.429 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.458 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.489 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.518 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.028) 0:01:14.547 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.576 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.031) 0:01:14.608 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.030) 0:01:14.638 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.668 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.030) 0:01:14.699 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.728 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.029) 0:01:14.757 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.037) 0:01:14.794 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "5368709120"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.079) 0:01:14.874 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.042) 0:01:14.916 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.034) 0:01:14.951 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:37:07 +0000 (0:00:00.031) 0:01:14.983 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.013 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.044 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.037) 0:01:15.081 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "5368709120"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.033) 0:01:15.115 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.146 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.029) 0:01:15.175 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.029) 0:01:15.205 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.235 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.032) 0:01:15.268 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.029) 0:01:15.297 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.029) 0:01:15.326 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.028) 0:01:15.355 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.386 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.036) 0:01:15.423 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.123) 0:01:15.547 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": ""}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.036) 0:01:15.583 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.040) 0:01:15.623 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.028) 0:01:15.652 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.034) 0:01:15.687 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.031) 0:01:15.718 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.032) 0:01:15.750 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.780 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.811 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.841 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.045) 0:01:15.887 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.024) 0:01:15.912 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.038) 0:01:15.951 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:37:08 +0000 (0:00:00.030) 0:01:15.981 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.031) 0:01:16.012 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.030) 0:01:16.043 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.024) 0:01:16.068 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.352) 0:01:16.420 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.036) 0:01:16.456 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.024) 0:01:16.481 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.033) 0:01:16.514 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.542 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.024) 0:01:16.567 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.595 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.027) 0:01:16.623 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.651 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.679 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.029) 0:01:16.709 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.029) 0:01:16.738 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.766 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.029) 0:01:16.795 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.028) 0:01:16.824 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.038) 0:01:16.863 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK
[Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.033) 0:01:16.897 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.029) 0:01:16.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.032) 0:01:16.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:37:09 +0000 (0:00:00.030) 0:01:16.989 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.033) 0:01:17.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.032) 0:01:17.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.073) 0:01:17.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.189 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the 
requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.401 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.432 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.033) 0:01:17.496 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.029) 0:01:17.557 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.034) 0:01:17.592 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.033) 0:01:17.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.034) 0:01:17.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.031) 0:01:17.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.028) 0:01:17.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.032) 0:01:17.868 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.030) 0:01:17.929 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.027) 0:01:17.957 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=513 changed=6 unreachable=0 failed=0 skipped=509 rescued=0 ignored=0 Wednesday 01 June 2022 16:37:10 +0000 (0:00:00.015) 0:01:17.973 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.91s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state 
--- 1.34s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.09s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:2 ---------------- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : set up new/current mounts ------------------ 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Update facts ------------------------------- 0.91s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.72s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : get required packages ---------------------- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:37:11 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:37:13 +0000 (0:00:01.288) 0:00:01.311 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvm_pool_then_remove_nvme_generated.yml ***************** 2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all]
********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:37:13 +0000 (0:00:00.017) 0:00:01.329 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:37:13 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:37:15 +0000 (0:00:01.289) 0:00:01.312 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvm_pool_then_remove_scsi_generated.yml ***************** 2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_scsi_generated.yml:3 Wednesday 01 June 2022 16:37:15 +0000 (0:00:00.017) 0:00:01.329 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_scsi_generated.yml:7 Wednesday 01 June 2022 16:37:16 +0000 (0:00:01.096) 0:00:02.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:2 Wednesday 01 June 2022 16:37:16 +0000 (0:00:00.025) 0:00:02.452 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:15 Wednesday 01 June 2022 16:37:16 +0000 (0:00:00.816) 0:00:03.269 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:37:17 +0000 (0:00:00.038) 0:00:03.307 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:37:17 +0000 (0:00:00.151) 0:00:03.459 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:37:17 
+0000 (0:00:00.523) 0:00:03.982 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:37:17 +0000 (0:00:00.075) 0:00:04.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:37:17 +0000 (0:00:00.024) 0:00:04.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:37:17 +0000 (0:00:00.023) 0:00:04.105 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:37:18 +0000 (0:00:00.208) 0:00:04.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:37:18 +0000 (0:00:00.019) 0:00:04.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:37:19 +0000 (0:00:01.079) 0:00:05.414 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:37:19 +0000 (0:00:00.045) 0:00:05.459 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:37:19 +0000 (0:00:00.042) 0:00:05.501 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:37:19 +0000 (0:00:00.664) 0:00:06.166 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:37:19 +0000 (0:00:00.085) 0:00:06.251 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:37:19 +0000 (0:00:00.020) 0:00:06.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:37:20 +0000 (0:00:00.022) 0:00:06.294 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:37:20 +0000 (0:00:00.018) 0:00:06.313 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:37:20 +0000 (0:00:00.786) 0:00:07.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:37:22 +0000 (0:00:01.823)       0:00:08.923 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:37:22 +0000 (0:00:00.046)       0:00:08.969 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:37:22 +0000 (0:00:00.027)       0:00:08.997 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.533)       0:00:09.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.029)       0:00:09.561
********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.026)       0:00:09.587 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.031)       0:00:09.619 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.030)       0:00:09.650 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.032)       0:00:09.682 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.027)       0:00:09.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.028)       0:00:09.738 ********

TASK
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.026)       0:00:09.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.028)       0:00:09.792 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.446)       0:00:10.238 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:37:23 +0000 (0:00:00.027)       0:00:10.266 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:18
Wednesday 01 June 2022 16:37:24 +0000 (0:00:00.981)       0:00:11.248 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:25
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.030)       0:00:11.279 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path:
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.044)       0:00:11.323 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.549)       0:00:11.872 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.035)       0:00:11.908 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.029)       0:00:11.938 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create two LVM logical volumes under volume group 'foo'] *****************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:30
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.030)       0:00:11.969 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.053)       0:00:12.022 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:37:25 +0000 (0:00:00.040)       0:00:12.063 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables]
****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.507)       0:00:12.570 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.068)       0:00:12.639 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.029)       0:00:12.668 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.028) 0:00:12.697 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.058) 0:00:12.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.061) 0:00:12.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.030) 0:00:12.847 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.037) 0:00:12.885 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.036) 0:00:12.921 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.028) 0:00:12.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.029) 0:00:12.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.031) 0:00:13.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.028) 0:00:13.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.042) 0:00:13.082 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:37:26 +0000 (0:00:00.028) 0:00:13.110 
******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": 
"", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:37:28 +0000 (0:00:02.022) 0:00:15.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:37:28 +0000 (0:00:00.031) 0:00:15.164 ******** TASK 
[linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:37:28 +0000 (0:00:00.027) 0:00:15.192 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:37:28 +0000 (0:00:00.050) 0:00:15.242 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": 
"test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:37:29 +0000 (0:00:00.041) 0:00:15.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:37:29 +0000 (0:00:00.037) 0:00:15.321 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:37:29 +0000 (0:00:00.030) 0:00:15.352 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:37:29 +0000 (0:00:00.914) 0:00:16.266 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: 
[/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:37:30 +0000 (0:00:00.904) 0:00:17.171 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:37:31 +0000 (0:00:00.622) 0:00:17.793 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:37:31 +0000 (0:00:00.372) 0:00:18.166 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:37:31 +0000 (0:00:00.030) 0:00:18.196 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:45 Wednesday 01 June 2022 16:37:32 +0000 (0:00:00.845) 0:00:19.041 ******** included: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:37:32 +0000 (0:00:00.054) 0:00:19.096 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:37:32 +0000 (0:00:00.046) 0:00:19.143 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:37:32 +0000 (0:00:00.035) 0:00:19.178 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "d88a5dcd-1c25-44b5-b394-ef8a782750c9" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0fsMiH-rHke-0Gxy-TbuO-zjeY-xhx9-ApXB5z" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { 
"fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:37:33 +0000 (0:00:00.512) 0:00:19.691 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002797", "end": "2022-06-01 12:37:33.305568", "rc": 0, "start": "2022-06-01 12:37:33.302771" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:37:33 +0000 (0:00:00.465) 0:00:20.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": 
"0:00:00.002573", "end": "2022-06-01 12:37:33.678113", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:37:33.675540" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.364) 0:00:20.521 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.070) 0:00:20.591 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.032) 0:00:20.623 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.064) 0:00:20.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.039) 0:00:20.727 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.454) 0:00:21.181 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.041) 0:00:21.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:37:34 +0000 (0:00:00.038) 0:00:21.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.035) 0:00:21.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.035) 0:00:21.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.030) 0:00:21.362 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": 
"pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.041) 0:00:21.404 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.068) 0:00:21.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.037) 0:00:21.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.038) 0:00:21.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.032) 0:00:21.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.032) 0:00:21.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.030) 0:00:21.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.031) 0:00:21.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.031) 0:00:21.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.032) 0:00:21.739 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.063) 0:00:21.802 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.071) 0:00:21.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.029) 0:00:21.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.031) 0:00:21.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.030) 0:00:21.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.030) 0:00:21.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.031) 0:00:22.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.029) 0:00:22.057 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.091) 0:00:22.148 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.049) 0:00:22.198 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:37:35 +0000 (0:00:00.040) 0:00:22.238 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.061) 0:00:22.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.082) 0:00:22.382 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.035) 0:00:22.418 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.028) 0:00:22.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.029) 0:00:22.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.029) 0:00:22.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.030) 0:00:22.535 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.031) 0:00:22.567 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.062) 0:00:22.629 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.080) 0:00:22.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.035) 0:00:22.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.031) 0:00:22.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.037) 0:00:22.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.043) 0:00:22.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.032) 0:00:22.890 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.040) 0:00:22.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.032) 0:00:22.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.032) 0:00:22.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.031) 0:00:23.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.032) 0:00:23.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.030) 0:00:23.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.031) 0:00:23.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.031) 0:00:23.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.030) 0:00:23.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.030) 0:00:23.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:37:36 +0000 (0:00:00.035) 0:00:23.250 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.032) 0:00:23.282 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.069) 0:00:23.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.034) 0:00:23.387 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.131) 0:00:23.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:37:37 
+0000 (0:00:00.036) 0:00:23.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.042) 0:00:23.597 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.037) 0:00:23.635 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.035) 0:00:23.670 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.039) 0:00:23.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.030) 0:00:23.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.031) 0:00:23.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.030) 0:00:23.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.032) 0:00:23.835 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.047) 0:00:23.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.037) 0:00:23.919 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.037) 0:00:23.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.031) 0:00:23.989 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.032) 0:00:24.021 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.037) 0:00:24.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:37:37 +0000 (0:00:00.036) 0:00:24.095 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101448.2201216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101448.2201216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4982, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101448.2201216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.386) 0:00:24.482 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.082) 0:00:24.564 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.037) 0:00:24.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.037) 0:00:24.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.038) 0:00:24.678 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.037) 0:00:24.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:24.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.033) 0:00:24.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:24.810 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.038) 0:00:24.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:24.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.029) 0:00:24.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:24.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.034) 0:00:24.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.031) 
0:00:25.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.037) 0:00:25.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.036) 0:00:25.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:25.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:25.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.034) 0:00:25.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.031) 0:00:25.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.031) 0:00:25.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:37:38 +0000 (0:00:00.030) 0:00:25.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.032) 0:00:25.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.031) 0:00:25.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.034) 0:00:25.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.031) 0:00:25.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.031) 0:00:25.428 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.470) 0:00:25.899 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:37:39 +0000 (0:00:00.370) 0:00:26.269 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.039) 0:00:26.309 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.034) 0:00:26.344 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:26.375 
********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:26.406 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.030) 0:00:26.437 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.030) 0:00:26.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:26.498 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.035) 0:00:26.533 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.034) 0:00:26.568 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.041) 0:00:26.609 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035561", "end": "2022-06-01 12:37:40.171676", "rc": 0, "start": "2022-06-01 12:37:40.136115" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.409) 0:00:27.019 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.040) 0:00:27.059 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.038) 0:00:27.098 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:27.130 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:27.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.031) 0:00:27.194 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.041) 0:00:27.236 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:37:40 +0000 (0:00:00.036) 0:00:27.273 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.038) 0:00:27.312 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.176) 0:00:27.489 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.036) 0:00:27.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d88a5dcd-1c25-44b5-b394-ef8a782750c9" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d88a5dcd-1c25-44b5-b394-ef8a782750c9" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.041) 0:00:27.567 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.038) 0:00:27.605 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.034) 0:00:27.640 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.039) 0:00:27.679 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.030) 0:00:27.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.030) 0:00:27.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.031) 0:00:27.771 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.032) 0:00:27.803 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.047) 0:00:27.851 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.035) 0:00:27.887 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.038) 0:00:27.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.031) 0:00:27.957 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.034) 0:00:27.992 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.037) 0:00:28.029 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:37:41 +0000 (0:00:00.035) 0:00:28.064 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101447.9751215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101447.9751215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4948, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101447.9751215, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.383) 0:00:28.448 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.037) 0:00:28.486 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.036) 0:00:28.522 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.033) 0:00:28.556 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.032) 0:00:28.588 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.035) 0:00:28.623 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.030) 0:00:28.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.029) 0:00:28.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.029) 0:00:28.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.039) 0:00:28.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.033) 0:00:28.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.034) 0:00:28.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.032) 0:00:28.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.032) 0:00:28.886 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.031) 0:00:28.918 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.038) 0:00:28.956 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.040) 0:00:28.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.031) 0:00:29.028 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.051) 0:00:29.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.039) 0:00:29.119 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.036) 0:00:29.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.037) 0:00:29.193 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.034) 0:00:29.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:37:42 +0000 (0:00:00.031) 0:00:29.259 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.030) 0:00:29.290 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.030) 0:00:29.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.029) 0:00:29.350 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.031) 0:00:29.382 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.368) 0:00:29.750 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.361) 0:00:30.112 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.038) 0:00:30.150 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.036) 0:00:30.186 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:37:43 +0000 (0:00:00.082) 0:00:30.269 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.032) 0:00:30.301 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.030) 0:00:30.332 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.030) 0:00:30.363 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.030) 0:00:30.394 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.033) 0:00:30.428 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.036) 0:00:30.465 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.038) 0:00:30.503 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.036724", "end": "2022-06-01 12:37:44.075579", "rc": 0, "start": "2022-06-01 12:37:44.038855" }
STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.418) 0:00:30.922 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.039) 0:00:30.962 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.038) 0:00:31.000 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.033) 0:00:31.034 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.034) 0:00:31.068 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.032) 0:00:31.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.032) 0:00:31.133 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.031) 0:00:31.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.031) 0:00:31.195 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.027) 0:00:31.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove one of the LVM logical volumes in 'foo' created above] ************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:47
Wednesday 01 June 2022 16:37:44 +0000 (0:00:00.033) 0:00:31.257 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.064) 0:00:31.321 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.044) 0:00:31.366 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.514) 0:00:31.881 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.070) 0:00:31.952 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.032) 0:00:31.984 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.031) 0:00:32.016 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.065) 0:00:32.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.026) 0:00:32.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.031) 0:00:32.139 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g", "state": "absent" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.038) 0:00:32.178 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.035) 0:00:32.214 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.031) 0:00:32.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:37:45 +0000 (0:00:00.030) 0:00:32.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:37:46 +0000 (0:00:00.030) 0:00:32.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:37:46 +0000 (0:00:00.036) 0:00:32.342 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:37:46 +0000 (0:00:00.049) 0:00:32.392 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:37:46 +0000 (0:00:00.031) 0:00:32.424 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:37:47 +0000 (0:00:01.785) 0:00:34.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:37:47 +0000 (0:00:00.032) 0:00:34.243 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:37:47 +0000 (0:00:00.028) 0:00:34.272 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1",
"state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:37:48 +0000 (0:00:00.042) 0:00:34.314 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", 
"_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:37:48 +0000 (0:00:00.041) 0:00:34.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:37:48 +0000 (0:00:00.086) 0:00:34.443 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:37:48 +0000 (0:00:00.378) 0:00:34.821 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:37:49 +0000 (0:00:00.660) 0:00:35.481 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:37:49 +0000 (0:00:00.372) 0:00:35.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:37:50 +0000 (0:00:00.646) 0:00:36.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:37:50 +0000 (0:00:00.399) 0:00:36.900 ******** TASK [linux-system-roles.storage : Update facts] 
******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:37:50 +0000 (0:00:00.031) 0:00:36.931 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:64 Wednesday 01 June 2022 16:37:51 +0000 (0:00:00.844) 0:00:37.775 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:37:51 +0000 (0:00:00.054) 0:00:37.830 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:37:51 +0000 (0:00:00.041) 0:00:37.871 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:37:51 +0000 (0:00:00.031) 0:00:37.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0fsMiH-rHke-0Gxy-TbuO-zjeY-xhx9-ApXB5z" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.378) 0:00:38.281 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002702", "end": "2022-06-01 12:37:51.795049", "rc": 0, "start": "2022-06-01 12:37:51.792347" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.361) 0:00:38.642 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002789", "end": "2022-06-01 12:37:52.178458", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:37:52.175669" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.383) 0:00:39.026 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
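The warning above flags that the include loops with a `loop_var` name (`storage_test_pool`) that is already in use by an enclosing loop, which can silently overwrite the outer value. The fix the warning suggests is to set a distinct `loop_var` in `loop_control`. A minimal sketch of what that looks like, assuming a calling task shaped like the one in verify-role-results.yml (the variable name `storage_test_pool_item` is illustrative, not taken from the actual test source):

```yaml
# Hedged sketch: rename the inner loop variable so it no longer
# collides with 'storage_test_pool' from an enclosing loop.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item   # illustrative name; pick any unused variable
```

With a unique `loop_var`, each included iteration reads its own pool dict without clobbering the outer loop's variable, and the `[WARNING]` disappears from the run.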
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.070) 0:00:39.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.032) 0:00:39.128 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.072) 0:00:39.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:37:52 +0000 (0:00:00.040) 0:00:39.241 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.360) 0:00:39.601 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.042) 0:00:39.643 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.038) 0:00:39.682 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.037) 0:00:39.720 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.036) 0:00:39.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.029) 0:00:39.786 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.041) 0:00:39.827 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.057) 0:00:39.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.030) 0:00:39.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.028) 0:00:39.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.029) 0:00:39.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.029) 0:00:40.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.030) 0:00:40.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:37:53 +0000 (0:00:00.033) 0:00:40.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.030) 0:00:40.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.030) 0:00:40.129 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.063) 0:00:40.193 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:53 +0000 (0:00:00.075) 0:00:40.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.031) 0:00:40.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.030) 0:00:40.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.030) 0:00:40.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.029) 0:00:40.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.032) 0:00:40.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.028) 0:00:40.452 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.059) 0:00:40.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.035) 0:00:40.548 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.037) 0:00:40.585 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.058) 0:00:40.644 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.036) 0:00:40.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.037) 0:00:40.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.031) 0:00:40.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.032) 0:00:40.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.030) 0:00:40.811 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.030) 0:00:40.842 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.030) 0:00:40.872 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.064) 0:00:40.937 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:37:54 +0000 (0:00:00.079) 0:00:41.017 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.029)       0:00:41.046 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.031)       0:00:41.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.029)       0:00:41.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.028)       0:00:41.136 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.036)       0:00:41.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.032)       0:00:41.205 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason":
"Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.030)       0:00:41.235 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022  16:37:54 +0000 (0:00:00.035)       0:00:41.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.302 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.030)       0:00:41.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert]
******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.030)       0:00:41.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.092)       0:00:41.519 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.551 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.583 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:41.615 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.072)       0:00:41.687 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.035)       0:00:41.722 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.124)       0:00:41.847 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022
16:37:55 +0000 (0:00:00.035)       0:00:41.883 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.043)       0:00:41.927 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.038)       0:00:41.965 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.039)       0:00:42.005 ********
ok: [/cache/rhel-x.qcow2] => { "changed":
false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.038)       0:00:42.043 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.030)       0:00:42.074 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.029)       0:00:42.103 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.029)       0:00:42.133 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.031)       0:00:42.165 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.050)       0:00:42.216 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  16:37:55 +0000 (0:00:00.036)       0:00:42.252 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.038)       0:00:42.291 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.030)       0:00:42.322 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.031)       0:00:42.354 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK
[Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.038)       0:00:42.390 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.038)       0:00:42.428 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101448.2201216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101448.2201216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4982, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101448.2201216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.380)       0:00:42.809 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.038)       0:00:42.847 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.036)       0:00:42.884 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.037)       0:00:42.921 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.034)       0:00:42.955 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.040)       0:00:42.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.033)       0:00:43.030 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.032)       0:00:43.062 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.030)       0:00:43.092 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.038)       0:00:43.131 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.034)       0:00:43.165 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.031)       0:00:43.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.034)       0:00:43.232 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  16:37:56 +0000 (0:00:00.030)       0:00:43.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)
0:00:43.293 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.037)       0:00:43.330 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.036)       0:00:43.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.028)       0:00:43.395 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)       0:00:43.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.029)       0:00:43.455 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID]
**********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.029)       0:00:43.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)       0:00:43.516 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.031)       0:00:43.547 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)       0:00:43.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.032)       0:00:43.610 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.029)       0:00:43.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)       0:00:43.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.030)       0:00:43.700 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:37:57 +0000 (0:00:00.376)       0:00:44.076 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.382)       0:00:44.459 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.035)       0:00:44.495 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.031)       0:00:44.527 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.028)       0:00:44.555
********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.027)       0:00:44.583 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.029)       0:00:44.613 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.027)       0:00:44.641 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.028)       0:00:44.669 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.031)       0:00:44.700 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.030)       0:00:44.730 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV]
********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.035)       0:00:44.766 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.033811", "end": "2022-06-01 12:37:58.325553", "rc": 0, "start": "2022-06-01 12:37:58.291742" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.408)       0:00:45.174 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.038)       0:00:45.213 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  16:37:58 +0000 (0:00:00.040)       0:00:45.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.033)       0:00:45.287 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.032)       0:00:45.319 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.033)       0:00:45.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.032)       0:00:45.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.034)       0:00:45.420 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.035)       0:00:45.456 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.126)       0:00:45.582 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.036)       0:00:45.618 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.039)       0:00:45.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.034)       0:00:45.692 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  16:37:59 +0000 (0:00:00.040)       0:00:45.733 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.032) 0:00:45.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.029) 0:00:45.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.030) 0:00:45.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.029) 0:00:45.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.031) 0:00:45.887 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.047) 0:00:45.934 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.026) 0:00:45.961 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.035) 0:00:45.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.029) 0:00:46.026 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.030) 0:00:46.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.030) 0:00:46.088 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:37:59 +0000 (0:00:00.030) 0:00:46.118 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.375) 0:00:46.494 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.037) 0:00:46.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.026) 0:00:46.558 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.033) 0:00:46.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.029) 0:00:46.621 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.026) 0:00:46.648 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.030) 0:00:46.678 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.075) 0:00:46.754 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.032) 0:00:46.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.028) 0:00:46.815 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.032) 0:00:46.848 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.031) 0:00:46.879 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.030) 0:00:46.909 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.033) 0:00:46.943 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.030) 0:00:46.973 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.037) 0:00:47.011 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.034) 0:00:47.046 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.030) 0:00:47.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.030) 0:00:47.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.031) 0:00:47.138 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.029) 0:00:47.168 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.029) 0:00:47.198 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.032) 0:00:47.230 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:38:00 +0000 (0:00:00.031) 0:00:47.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.033) 0:00:47.326 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.390 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume]
**********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.419 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.449 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.028) 0:00:47.477 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.035) 0:00:47.512 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.542 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.574 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.603 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.030) 0:00:47.634 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.664 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.035) 0:00:47.700 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.032) 0:00:47.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.030) 0:00:47.794 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:47.855 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.032) 0:00:47.888 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:47.951 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.032) 0:00:47.984 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.031) 0:00:48.016 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.030) 0:00:48.046 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.030) 0:00:48.076 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:66
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.029) 0:00:48.106 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.071) 0:00:48.177 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:38:01 +0000 (0:00:00.046) 0:00:48.224 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.489) 0:00:48.713 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.070) 0:00:48.784 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.035) 0:00:48.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.035) 0:00:48.855 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.064) 0:00:48.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.027) 0:00:48.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.030) 0:00:48.979 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g", "state": "absent" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.076) 0:00:49.056 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.035) 0:00:49.091 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.031) 0:00:49.122 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.035) 0:00:49.157 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.042) 0:00:49.200 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:38:02 +0000 (0:00:00.037) 0:00:49.238 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:38:03 +0000 (0:00:00.052) 0:00:49.291 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:38:03 +0000 (0:00:00.028) 0:00:49.319 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb",
"/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:38:04 +0000 (0:00:01.350) 0:00:50.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.031) 0:00:50.701 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.028) 0:00:50.729 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test
verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.042) 0:00:50.771 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.045) 0:00:50.816 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.033) 0:00:50.850 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:38:04 +0000 (0:00:00.029) 0:00:50.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:38:05 +0000 (0:00:00.660) 0:00:51.539 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:38:05 +0000 (0:00:00.413) 0:00:51.953 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:38:06 +0000 (0:00:00.647) 0:00:52.601 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:38:06 +0000 (0:00:00.381) 0:00:52.982 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:38:06 +0000 (0:00:00.030) 0:00:53.013 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:82 Wednesday 01 June 2022 16:38:07 +0000 (0:00:00.851) 0:00:53.864 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:38:07 +0000 (0:00:00.060) 0:00:53.925 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", 
"type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:38:07 +0000 (0:00:00.042) 0:00:53.968 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:38:07 +0000 (0:00:00.030) 0:00:53.998 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0fsMiH-rHke-0Gxy-TbuO-zjeY-xhx9-ApXB5z" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": 
"disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:38:08 +0000 (0:00:00.392) 0:00:54.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003031", "end": "2022-06-01 12:38:07.951486", "rc": 0, "start": "2022-06-01 12:38:07.948455" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:38:08 +0000 (0:00:00.409) 0:00:54.800 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002415", "end": "2022-06-01 12:38:08.328405", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:38:08.325990" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:38:08 +0000 (0:00:00.373) 0:00:55.173 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:38:08 +0000 (0:00:00.068) 0:00:55.242 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:38:08 +0000 (0:00:00.029) 0:00:55.272 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.060) 0:00:55.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.036) 0:00:55.368 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.365) 0:00:55.734 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.040) 0:00:55.774 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.040) 0:00:55.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.035) 0:00:55.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.035) 0:00:55.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.028) 0:00:55.913 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.053) 0:00:55.966 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.061) 0:00:56.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.032) 0:00:56.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.031) 0:00:56.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.032) 0:00:56.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.032) 0:00:56.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.031) 0:00:56.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:38:09 +0000 (0:00:00.032) 0:00:56.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:38:09 +0000 (0:00:00.032) 0:00:56.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.033) 0:00:56.287 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.061) 0:00:56.349 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.073) 0:00:56.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.032) 0:00:56.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:56.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:56.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.029) 0:00:56.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:56.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:56.605 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.060) 0:00:56.665 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.035) 0:00:56.701 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.033) 0:00:56.734 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.061) 0:00:56.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.036) 0:00:56.832 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.038) 0:00:56.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:56.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.029) 0:00:56.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.031) 0:00:56.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.033) 0:00:56.996 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:57.027 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.063) 0:00:57.091 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.120) 0:00:57.212 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:57.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:38:10 +0000 (0:00:00.030) 0:00:57.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.031) 0:00:57.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.033) 0:00:57.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:57.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:57.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:57.429 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.035) 0:00:57.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.033) 0:00:57.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.033) 0:00:57.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.031) 0:00:57.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:57.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:57.625 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.029) 0:00:57.655 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.032) 0:00:57.687 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.034) 0:00:57.722 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.031) 0:00:57.753 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.072) 0:00:57.826 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.036) 0:00:57.862 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.125) 0:00:57.988 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.039) 0:00:58.028 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1290774, "block_size": 4096, "block_total": 1308160, "block_used": 17386, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2621437, "inode_total": 2621440, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 5287010304, "size_total": 5358223360, "uuid": "27d4e5af-0347-468f-86a3-968ae205e2d0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.037) 0:00:58.074 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.036) 0:00:58.111 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.036) 0:00:58.148 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.037) 0:00:58.185 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.042) 0:00:58.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:38:11 +0000 (0:00:00.030) 0:00:58.258 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.030) 0:00:58.289 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.031) 0:00:58.320 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.046) 0:00:58.366 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.034) 0:00:58.400 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.038) 0:00:58.439 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.030) 0:00:58.470 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.031) 0:00:58.501 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.036) 0:00:58.537 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.035) 0:00:58.573 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101448.2201216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101448.2201216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4982, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101448.2201216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.386) 0:00:58.959 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.039) 0:00:58.999 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.037) 0:00:59.036 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.034) 0:00:59.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.030) 0:00:59.102 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.036) 0:00:59.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.035) 0:00:59.174 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.033) 0:00:59.207 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:38:12 +0000 (0:00:00.033) 0:00:59.240 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.037) 0:00:59.278 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.029) 0:00:59.308 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.032) 0:00:59.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.034) 0:00:59.436 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.082) 0:00:59.519 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.035) 0:00:59.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.615 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.645 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.031) 0:00:59.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.032) 0:00:59.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.031) 0:00:59.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.030) 0:00:59.770 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.032) 0:00:59.802 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.031) 0:00:59.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.029) 0:00:59.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.031) 0:00:59.896 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:38:13 +0000 (0:00:00.372) 0:01:00.269 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.366) 0:01:00.635 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.039) 0:01:00.675 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.033) 0:01:00.708 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.031) 0:01:00.740 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.031) 0:01:00.771 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.029) 0:01:00.801 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.030) 0:01:00.831 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.032) 0:01:00.864 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.035) 0:01:00.899 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.034) 0:01:00.933 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:38:14 +0000 (0:00:00.043) 0:01:00.976 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037352", "end": "2022-06-01 12:38:14.541744", "rc": 0, "start": "2022-06-01 12:38:14.504392" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.412) 0:01:01.389 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.040) 0:01:01.430 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.051) 0:01:01.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.033) 0:01:01.515 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.039) 0:01:01.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.033) 0:01:01.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.033) 0:01:01.621 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.032) 0:01:01.654 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.036) 0:01:01.690 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.129) 0:01:01.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.036) 0:01:01.856 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.043) 0:01:01.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.029) 0:01:01.930 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.035) 0:01:01.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.030) 0:01:01.996 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.032) 0:01:02.028 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.031) 0:01:02.060 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.032) 0:01:02.093 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.034) 0:01:02.127 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.047) 0:01:02.174 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.027) 0:01:02.201 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:38:15 +0000 (0:00:00.040) 0:01:02.241 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.075) 0:01:02.316 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.032) 0:01:02.348 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.032) 0:01:02.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.026) 0:01:02.408 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.363) 0:01:02.771 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.037) 0:01:02.808 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.027) 0:01:02.835 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.036) 0:01:02.871 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:02.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.026) 0:01:02.928 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:02.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.031) 0:01:02.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.031) 0:01:03.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:03.052 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:03.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:03.114 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.029) 0:01:03.144 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.028) 0:01:03.173 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.030) 0:01:03.203 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:38:16 +0000 (0:00:00.040) 0:01:03.244 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.033) 0:01:03.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.029) 0:01:03.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.027) 0:01:03.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.027) 0:01:03.362 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.028) 0:01:03.391 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:03.421 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:03.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:03.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.029) 0:01:03.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.033) 0:01:03.610 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:03.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.703 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.033) 0:01:03.737 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:03.768 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.034) 0:01:03.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.834 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:03.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.033) 0:01:03.898 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.036) 0:01:03.935 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.035) 0:01:03.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.034) 0:01:04.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:04.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.030) 0:01:04.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:04.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.034) 0:01:04.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.035) 0:01:04.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.038) 0:01:04.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.031) 0:01:04.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:38:17 +0000 (0:00:00.029) 0:01:04.270 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.030) 0:01:04.300 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.028) 0:01:04.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove both of the LVM logical volumes in 'foo' created above] *********** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:84 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.030) 0:01:04.359 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.082) 0:01:04.442 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.042) 0:01:04.485 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.590) 0:01:05.076 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.073) 0:01:05.149 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.031) 0:01:05.180 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:38:18 +0000 (0:00:00.031) 0:01:05.211 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.064) 0:01:05.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.025) 0:01:05.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.031) 0:01:05.333 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.038) 0:01:05.371 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.034) 0:01:05.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.030) 0:01:05.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.030) 0:01:05.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.030) 0:01:05.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.031) 0:01:05.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.043) 0:01:05.572 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:38:19 +0000 (0:00:00.030) 0:01:05.603 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:38:21 +0000 (0:00:01.841) 0:01:07.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.035) 0:01:07.480 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.029) 0:01:07.509 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", 
"/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 
0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.043) 0:01:07.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.041) 0:01:07.594 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.033) 0:01:07.628 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:38:21 +0000 (0:00:00.382) 0:01:08.011 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:38:22 +0000 (0:00:00.630) 0:01:08.641 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:38:22 +0000 (0:00:00.030) 0:01:08.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:38:23 +0000 (0:00:00.663) 0:01:09.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:38:23 +0000 (0:00:00.371) 0:01:09.707 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:38:23 +0000 (0:00:00.030) 0:01:09.738 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:100 Wednesday 01 June 2022 16:38:24 +0000 (0:00:00.896) 0:01:10.634 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 
2022 16:38:24 +0000 (0:00:00.100) 0:01:10.734 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:38:24 +0000 (0:00:00.041) 0:01:10.776 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:38:24 +0000 (0:00:00.028) 0:01:10.804 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:38:24 +0000 (0:00:00.375) 0:01:11.179 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002720", "end": "2022-06-01 12:38:24.701655", "rc": 0, "start": "2022-06-01 12:38:24.698935" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.375) 0:01:11.555 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002667", "end": "2022-06-01 12:38:25.071120", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:38:25.068453" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.364) 0:01:11.919 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
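(Editor's note: the warning above comes from nested `include_tasks` loops that reuse the same loop variable. A minimal sketch of the fix the warning suggests, with a hypothetical task name and a renamed variable that are illustrative only and not taken from the actual test suite:)

```yaml
# Hypothetical outer loop over the pools list. Renaming the loop variable
# via loop_control prevents it from colliding with a variable of the same
# name set by an inner include that also loops.
- name: Verify each pool
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_outer  # instead of reusing 'storage_test_pool'
```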
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.070) 0:01:11.990 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.035) 0:01:12.025 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.062) 0:01:12.088 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.040) 0:01:12.129 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.028) 0:01:12.157 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.029) 0:01:12.186 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.037) 0:01:12.224 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:38:25 +0000 (0:00:00.034) 0:01:12.259 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.035) 0:01:12.294 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.031) 0:01:12.326 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.028) 0:01:12.354 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.057) 0:01:12.412 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.503 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.029) 0:01:12.533 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.028) 0:01:12.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.029) 0:01:12.622 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.653 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.058) 0:01:12.712 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.079) 0:01:12.792 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.032) 0:01:12.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.855 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.033) 0:01:12.888 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.038) 0:01:12.927 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.030) 0:01:12.958 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.029) 0:01:12.987 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.060) 0:01:13.047 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.037) 0:01:13.085 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.029) 0:01:13.114 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.029) 0:01:13.143 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.031) 0:01:13.175 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:38:26 +0000 (0:00:00.061) 0:01:13.236 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.138) 0:01:13.375 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:13.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.027) 0:01:13.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.029) 0:01:13.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.029) 0:01:13.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.029) 0:01:13.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.028) 0:01:13.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.027) 0:01:13.578 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:13.609 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:13.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.029) 0:01:13.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:13.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.028) 0:01:13.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.027) 0:01:13.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.027) 0:01:13.784 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.046) 0:01:13.830 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:13.860 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.029) 0:01:13.889 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.069) 0:01:13.959 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.038) 0:01:13.997 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.125) 0:01:14.123 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.037) 0:01:14.161 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.047) 0:01:14.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.030) 0:01:14.239 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:38:27 +0000 (0:00:00.037) 0:01:14.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.032) 0:01:14.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.030) 0:01:14.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.030) 0:01:14.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.027) 0:01:14.397 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.028) 0:01:14.426 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.045) 0:01:14.471 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.028) 0:01:14.500 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.035) 0:01:14.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.029) 0:01:14.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.029) 0:01:14.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.029) 0:01:14.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.027) 0:01:14.651 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.379) 0:01:15.031 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.039) 0:01:15.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.028) 0:01:15.099 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.034) 0:01:15.134 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.030) 0:01:15.165 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.027) 0:01:15.192 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.030) 0:01:15.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:38:28 +0000 (0:00:00.030) 0:01:15.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.035) 0:01:15.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.026) 0:01:15.315 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:15.344 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:15.374 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.028) 0:01:15.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.028) 0:01:15.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.083) 0:01:15.515 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.039) 0:01:15.554 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.034) 0:01:15.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:15.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.028) 0:01:15.647 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:15.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.032) 0:01:15.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:38:29 +0000 
(0:00:00.030) 0:01:15.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.032) 0:01:15.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.925 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:15.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:15.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:16.014 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.032) 0:01:16.047 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.031) 0:01:16.079 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:16.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.032) 0:01:16.142 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.029) 0:01:16.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.030) 0:01:16.202 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 
16:38:29 +0000 (0:00:00.033) 0:01:16.236 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:38:29 +0000 (0:00:00.036) 0:01:16.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.031) 0:01:16.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.029) 0:01:16.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.029) 0:01:16.424 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.032) 0:01:16.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.029) 0:01:16.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.516 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.547 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.034) 0:01:16.581 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.122) 0:01:16.703 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.034) 0:01:16.737 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.042) 0:01:16.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.032) 0:01:16.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.036) 0:01:16.849 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.029) 0:01:16.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.030) 0:01:16.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.031) 0:01:16.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.031) 0:01:17.003 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", 
"storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.044) 0:01:17.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.026) 0:01:17.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.035) 0:01:17.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.029) 0:01:17.139 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.032) 0:01:17.171 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.028) 0:01:17.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:38:30 +0000 (0:00:00.025) 0:01:17.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.368) 0:01:17.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.037) 0:01:17.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.025) 0:01:17.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.032) 0:01:17.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.031) 0:01:17.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.082) 0:01:17.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.033) 0:01:17.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.030) 0:01:17.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.030) 0:01:17.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.024) 0:01:17.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] 
*********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.030) 0:01:17.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.033) 0:01:17.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.031) 0:01:18.020 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.030) 0:01:18.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.031) 0:01:18.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.037) 0:01:18.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.035) 0:01:18.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.034) 0:01:18.189 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.031) 0:01:18.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:38:31 +0000 (0:00:00.031) 0:01:18.252 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.032) 0:01:18.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.031) 0:01:18.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.032) 0:01:18.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.036) 0:01:18.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.029) 0:01:18.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.033) 0:01:18.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the 
requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.031) 0:01:18.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.033) 0:01:18.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.604 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.033) 0:01:18.638 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.029) 0:01:18.668 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.729 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.032) 0:01:18.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.029) 0:01:18.791 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.042) 0:01:18.833 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.032) 0:01:18.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.031) 0:01:18.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.029) 0:01:18.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.034) 0:01:18.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:18.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.029) 0:01:19.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:19.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.031) 0:01:19.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.032) 0:01:19.117 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.041) 0:01:19.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.030) 0:01:19.190 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.028) 0:01:19.218 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=515 changed=6 unreachable=0 failed=0 skipped=509 rescued=0 ignored=0 Wednesday 01 June 2022 16:38:32 +0000 (0:00:00.016) 0:01:19.234 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state 
--- 1.35s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.10s /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_scsi_generated.yml:3 - linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.98s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.91s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : set up new/current mounts ------------------ 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.82s /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml:2 ---------------- linux-system-roles.storage : make sure required packages are installed --- 0.79s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. 
Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:38:33 +0000 (0:00:00.021) 0:00:00.021 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:38:34 +0000 (0:00:01.266) 0:00:01.288 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvmvdo_then_remove.yml ********************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:2 Wednesday 01 June 2022 16:38:35 +0000 (0:00:00.019) 0:00:01.308 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:15 Wednesday 01 June 2022 16:38:36 +0000 (0:00:01.106) 0:00:02.414 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.037) 0:00:02.452 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.153) 0:00:02.605 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.557) 0:00:03.163 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": 
"item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.077) 0:00:03.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.023) 0:00:03.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:38:36 +0000 (0:00:00.022) 0:00:03.286 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:38:37 +0000 (0:00:00.195) 0:00:03.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:38:37 +0000 (0:00:00.019) 0:00:03.500 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:38:38 +0000 (0:00:01.077) 0:00:04.578 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:38:38 +0000 (0:00:00.045) 0:00:04.624 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:38:38 +0000 (0:00:00.045) 0:00:04.669 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.678) 0:00:05.348 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.079) 0:00:05.428 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.020) 0:00:05.449 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.022) 0:00:05.471 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.020) 0:00:05.491 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:38:39 +0000 (0:00:00.790) 0:00:06.282 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": 
"nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", 
"source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:38:41 +0000 (0:00:01.779) 0:00:08.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:38:41 +0000 (0:00:00.043) 0:00:08.105 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:38:41 +0000 (0:00:00.027) 0:00:08.132 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.502) 0:00:08.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.030) 0:00:08.665 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.026) 0:00:08.691 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.032) 0:00:08.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list 
of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.031) 0:00:08.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.032) 0:00:08.788 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.063) 0:00:08.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.034) 0:00:08.886 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.028) 0:00:08.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:38:42 +0000 (0:00:00.028) 0:00:08.944 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:38:43 +0000 (0:00:00.473) 0:00:09.418 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:38:43 +0000 (0:00:00.027) 0:00:09.446 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:18 Wednesday 01 June 2022 16:38:43 +0000 (0:00:00.798) 0:00:10.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Gather package facts] **************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:25 Wednesday 01 June 2022 16:38:43 +0000 (0:00:00.029) 0:00:10.274 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "packages": { "NetworkManager": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-libnm": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-libnm", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-team": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-team", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-tui": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-tui", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "PackageKit": [ { "arch": "x86_64", "epoch": null, "name": "PackageKit", "release": "2.el9", "source": "rpm", "version": "1.2.4" } ], "PackageKit-glib": [ { "arch": "x86_64", "epoch": null, "name": "PackageKit-glib", "release": "2.el9", "source": "rpm", "version": "1.2.4" } ], "abattis-cantarell-fonts": [ 
{ "arch": "noarch", "epoch": null, "name": "abattis-cantarell-fonts", "release": "4.el9", "source": "rpm", "version": "0.301" } ], "acl": [ { "arch": "x86_64", "epoch": null, "name": "acl", "release": "3.el9", "source": "rpm", "version": "2.3.1" } ], "alternatives": [ { "arch": "x86_64", "epoch": null, "name": "alternatives", "release": "2.el9", "source": "rpm", "version": "1.20" } ], "audit": [ { "arch": "x86_64", "epoch": null, "name": "audit", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "audit-libs": [ { "arch": "x86_64", "epoch": null, "name": "audit-libs", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "authselect": [ { "arch": "x86_64", "epoch": null, "name": "authselect", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "authselect-compat": [ { "arch": "x86_64", "epoch": null, "name": "authselect-compat", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "authselect-libs": [ { "arch": "x86_64", "epoch": null, "name": "authselect-libs", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "basesystem": [ { "arch": "noarch", "epoch": null, "name": "basesystem", "release": "13.el9", "source": "rpm", "version": "11" } ], "bash": [ { "arch": "x86_64", "epoch": null, "name": "bash", "release": "4.el9", "source": "rpm", "version": "5.1.8" } ], "blivet-data": [ { "arch": "noarch", "epoch": 1, "name": "blivet-data", "release": "13.el9_0", "source": "rpm", "version": "3.4.0" } ], "bubblewrap": [ { "arch": "x86_64", "epoch": null, "name": "bubblewrap", "release": "6.el9", "source": "rpm", "version": "0.4.1" } ], "bzip2": [ { "arch": "x86_64", "epoch": null, "name": "bzip2", "release": "8.el9", "source": "rpm", "version": "1.0.8" } ], "bzip2-libs": [ { "arch": "x86_64", "epoch": null, "name": "bzip2-libs", "release": "8.el9", "source": "rpm", "version": "1.0.8" } ], "c-ares": [ { "arch": "x86_64", "epoch": null, "name": "c-ares", "release": "5.el9", "source": "rpm", "version": "1.17.1" } ], 
"ca-certificates": [ { "arch": "noarch", "epoch": null, "name": "ca-certificates", "release": "94.el9", "source": "rpm", "version": "2020.2.50" } ], "checkpolicy": [ { "arch": "x86_64", "epoch": null, "name": "checkpolicy", "release": "1.el9", "source": "rpm", "version": "3.3" } ], "chrony": [ { "arch": "x86_64", "epoch": null, "name": "chrony", "release": "1.el9", "source": "rpm", "version": "4.2" } ], "cloud-init": [ { "arch": "noarch", "epoch": null, "name": "cloud-init", "release": "1.el9", "source": "rpm", "version": "22.1" } ], "cloud-utils-growpart": [ { "arch": "x86_64", "epoch": null, "name": "cloud-utils-growpart", "release": "10.el9", "source": "rpm", "version": "0.31" } ], "cockpit-bridge": [ { "arch": "x86_64", "epoch": null, "name": "cockpit-bridge", "release": "1.el9", "source": "rpm", "version": "269" } ], "cockpit-system": [ { "arch": "noarch", "epoch": null, "name": "cockpit-system", "release": "1.el9", "source": "rpm", "version": "269" } ], "cockpit-ws": [ { "arch": "x86_64", "epoch": null, "name": "cockpit-ws", "release": "1.el9", "source": "rpm", "version": "269" } ], "coreutils": [ { "arch": "x86_64", "epoch": null, "name": "coreutils", "release": "31.el9", "source": "rpm", "version": "8.32" } ], "coreutils-common": [ { "arch": "x86_64", "epoch": null, "name": "coreutils-common", "release": "31.el9", "source": "rpm", "version": "8.32" } ], "cpio": [ { "arch": "x86_64", "epoch": null, "name": "cpio", "release": "16.el9", "source": "rpm", "version": "2.13" } ], "cracklib": [ { "arch": "x86_64", "epoch": null, "name": "cracklib", "release": "27.el9", "source": "rpm", "version": "2.9.6" } ], "cracklib-dicts": [ { "arch": "x86_64", "epoch": null, "name": "cracklib-dicts", "release": "27.el9", "source": "rpm", "version": "2.9.6" } ], "cronie": [ { "arch": "x86_64", "epoch": null, "name": "cronie", "release": "5.el9", "source": "rpm", "version": "1.5.7" } ], "cronie-anacron": [ { "arch": "x86_64", "epoch": null, "name": "cronie-anacron", "release": 
"5.el9", "source": "rpm", "version": "1.5.7" } ], "crontabs": [ { "arch": "noarch", "epoch": null, "name": "crontabs", "release": "27.20190603git.el9_0", "source": "rpm", "version": "1.11" } ], "crypto-policies": [ { "arch": "noarch", "epoch": null, "name": "crypto-policies", "release": "1.gitb2323a1.el9", "source": "rpm", "version": "20220427" } ], "crypto-policies-scripts": [ { "arch": "noarch", "epoch": null, "name": "crypto-policies-scripts", "release": "1.gitb2323a1.el9", "source": "rpm", "version": "20220427" } ], "cryptsetup-libs": [ { "arch": "x86_64", "epoch": null, "name": "cryptsetup-libs", "release": "4.el9", "source": "rpm", "version": "2.4.3" } ], "curl": [ { "arch": "x86_64", "epoch": null, "name": "curl", "release": "18.el9", "source": "rpm", "version": "7.76.1" } ], "cyrus-sasl-lib": [ { "arch": "x86_64", "epoch": null, "name": "cyrus-sasl-lib", "release": "20.el9", "source": "rpm", "version": "2.1.27" } ], "daxctl-libs": [ { "arch": "x86_64", "epoch": null, "name": "daxctl-libs", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "dbus": [ { "arch": "x86_64", "epoch": 1, "name": "dbus", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-broker": [ { "arch": "x86_64", "epoch": null, "name": "dbus-broker", "release": "5.el9", "source": "rpm", "version": "28" } ], "dbus-common": [ { "arch": "noarch", "epoch": 1, "name": "dbus-common", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-libs": [ { "arch": "x86_64", "epoch": 1, "name": "dbus-libs", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-tools": [ { "arch": "x86_64", "epoch": 1, "name": "dbus-tools", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "device-mapper": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-event": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-event", "release": "4.el9", "source": "rpm", 
"version": "1.02.183" } ], "device-mapper-event-libs": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-event-libs", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-libs": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-libs", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-multipath": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-multipath", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "device-mapper-multipath-libs": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-multipath-libs", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "device-mapper-persistent-data": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-persistent-data", "release": "12.el9", "source": "rpm", "version": "0.9.0" } ], "dhcp-client": [ { "arch": "x86_64", "epoch": 12, "name": "dhcp-client", "release": "15.b1.el9", "source": "rpm", "version": "4.4.2" } ], "dhcp-common": [ { "arch": "noarch", "epoch": 12, "name": "dhcp-common", "release": "15.b1.el9", "source": "rpm", "version": "4.4.2" } ], "diffutils": [ { "arch": "x86_64", "epoch": null, "name": "diffutils", "release": "12.el9", "source": "rpm", "version": "3.7" } ], "dmidecode": [ { "arch": "x86_64", "epoch": 1, "name": "dmidecode", "release": "7.el9", "source": "rpm", "version": "3.3" } ], "dnf": [ { "arch": "noarch", "epoch": null, "name": "dnf", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "dnf-data": [ { "arch": "noarch", "epoch": null, "name": "dnf-data", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "dnf-plugins-core", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "dosfstools": [ { "arch": "x86_64", "epoch": null, "name": "dosfstools", "release": "3.el9", "source": "rpm", "version": "4.2" } ], "dracut": [ { "arch": "x86_64", "epoch": null, "name": "dracut", "release": 
"45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-config-generic": [ { "arch": "x86_64", "epoch": null, "name": "dracut-config-generic", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-network": [ { "arch": "x86_64", "epoch": null, "name": "dracut-network", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-squash": [ { "arch": "x86_64", "epoch": null, "name": "dracut-squash", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "e2fsprogs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "e2fsprogs-libs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs-libs", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "efi-filesystem": [ { "arch": "noarch", "epoch": null, "name": "efi-filesystem", "release": "2.el9_0", "source": "rpm", "version": "6" } ], "efibootmgr": [ { "arch": "x86_64", "epoch": null, "name": "efibootmgr", "release": "12.el9", "source": "rpm", "version": "16" } ], "efivar-libs": [ { "arch": "x86_64", "epoch": null, "name": "efivar-libs", "release": "2.el9", "source": "rpm", "version": "38" } ], "elfutils-default-yama-scope": [ { "arch": "noarch", "epoch": null, "name": "elfutils-default-yama-scope", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "elfutils-libelf": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libelf", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "elfutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libs", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "ethtool": [ { "arch": "x86_64", "epoch": 2, "name": "ethtool", "release": "1.el9", "source": "rpm", "version": "5.16" } ], "expat": [ { "arch": "x86_64", "epoch": null, "name": "expat", "release": "1.el9", "source": "rpm", "version": "2.4.7" } ], "file": [ { "arch": "x86_64", "epoch": null, "name": "file", "release": 
"8.el9", "source": "rpm", "version": "5.39" } ], "file-libs": [ { "arch": "x86_64", "epoch": null, "name": "file-libs", "release": "8.el9", "source": "rpm", "version": "5.39" } ], "filesystem": [ { "arch": "x86_64", "epoch": null, "name": "filesystem", "release": "2.el9", "source": "rpm", "version": "3.16" } ], "findutils": [ { "arch": "x86_64", "epoch": 1, "name": "findutils", "release": "5.el9", "source": "rpm", "version": "4.8.0" } ], "flashrom": [ { "arch": "x86_64", "epoch": null, "name": "flashrom", "release": "10.el9", "source": "rpm", "version": "1.2" } ], "fonts-filesystem": [ { "arch": "noarch", "epoch": 1, "name": "fonts-filesystem", "release": "7.el9.1", "source": "rpm", "version": "2.0.5" } ], "fuse-libs": [ { "arch": "x86_64", "epoch": null, "name": "fuse-libs", "release": "15.el9", "source": "rpm", "version": "2.9.9" } ], "fwupd": [ { "arch": "x86_64", "epoch": null, "name": "fwupd", "release": "2.el9_0", "source": "rpm", "version": "1.7.4" } ], "fwupd-plugin-flashrom": [ { "arch": "x86_64", "epoch": null, "name": "fwupd-plugin-flashrom", "release": "2.el9_0", "source": "rpm", "version": "1.7.4" } ], "gawk": [ { "arch": "x86_64", "epoch": null, "name": "gawk", "release": "6.el9", "source": "rpm", "version": "5.1.0" } ], "gawk-all-langpacks": [ { "arch": "x86_64", "epoch": null, "name": "gawk-all-langpacks", "release": "6.el9", "source": "rpm", "version": "5.1.0" } ], "gdbm-libs": [ { "arch": "x86_64", "epoch": 1, "name": "gdbm-libs", "release": "4.el9", "source": "rpm", "version": "1.19" } ], "gdisk": [ { "arch": "x86_64", "epoch": null, "name": "gdisk", "release": "5.el9", "source": "rpm", "version": "1.0.7" } ], "gdk-pixbuf2": [ { "arch": "x86_64", "epoch": null, "name": "gdk-pixbuf2", "release": "2.el9", "source": "rpm", "version": "2.42.6" } ], "geolite2-city": [ { "arch": "noarch", "epoch": null, "name": "geolite2-city", "release": "6.el9", "source": "rpm", "version": "20191217" } ], "geolite2-country": [ { "arch": "noarch", "epoch": null, 
"name": "geolite2-country", "release": "6.el9", "source": "rpm", "version": "20191217" } ], "gettext": [ { "arch": "x86_64", "epoch": null, "name": "gettext", "release": "7.el9", "source": "rpm", "version": "0.21" } ], "gettext-libs": [ { "arch": "x86_64", "epoch": null, "name": "gettext-libs", "release": "7.el9", "source": "rpm", "version": "0.21" } ], "glib-networking": [ { "arch": "x86_64", "epoch": null, "name": "glib-networking", "release": "3.el9", "source": "rpm", "version": "2.68.3" } ], "glib2": [ { "arch": "x86_64", "epoch": null, "name": "glib2", "release": "5.el9", "source": "rpm", "version": "2.68.4" } ], "glibc": [ { "arch": "x86_64", "epoch": null, "name": "glibc", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-common": [ { "arch": "x86_64", "epoch": null, "name": "glibc-common", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-gconv-extra": [ { "arch": "x86_64", "epoch": null, "name": "glibc-gconv-extra", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-minimal-langpack": [ { "arch": "x86_64", "epoch": null, "name": "glibc-minimal-langpack", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "gmp": [ { "arch": "x86_64", "epoch": 1, "name": "gmp", "release": "10.el9", "source": "rpm", "version": "6.2.0" } ], "gnupg2": [ { "arch": "x86_64", "epoch": null, "name": "gnupg2", "release": "1.el9", "source": "rpm", "version": "2.3.3" } ], "gnutls": [ { "arch": "x86_64", "epoch": null, "name": "gnutls", "release": "9.el9", "source": "rpm", "version": "3.7.3" } ], "gobject-introspection": [ { "arch": "x86_64", "epoch": null, "name": "gobject-introspection", "release": "10.el9", "source": "rpm", "version": "1.68.0" } ], "gpgme": [ { "arch": "x86_64", "epoch": null, "name": "gpgme", "release": "6.el9", "source": "rpm", "version": "1.15.1" } ], "grep": [ { "arch": "x86_64", "epoch": null, "name": "grep", "release": "5.el9", "source": "rpm", "version": "3.6" } ], "groff-base": [ { "arch": 
"x86_64", "epoch": null, "name": "groff-base", "release": "10.el9", "source": "rpm", "version": "1.22.4" } ], "grub2-common": [ { "arch": "noarch", "epoch": 1, "name": "grub2-common", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-efi-x64": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-efi-x64", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-pc": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-pc", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-pc-modules": [ { "arch": "noarch", "epoch": 1, "name": "grub2-pc-modules", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-tools": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-tools-minimal": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools-minimal", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grubby": [ { "arch": "x86_64", "epoch": null, "name": "grubby", "release": "55.el9", "source": "rpm", "version": "8.40" } ], "gsettings-desktop-schemas": [ { "arch": "x86_64", "epoch": null, "name": "gsettings-desktop-schemas", "release": "4.el9", "source": "rpm", "version": "40.0" } ], "gssproxy": [ { "arch": "x86_64", "epoch": null, "name": "gssproxy", "release": "4.el9", "source": "rpm", "version": "0.8.4" } ], "gzip": [ { "arch": "x86_64", "epoch": null, "name": "gzip", "release": "1.el9", "source": "rpm", "version": "1.12" } ], "hdparm": [ { "arch": "x86_64", "epoch": null, "name": "hdparm", "release": "2.el9", "source": "rpm", "version": "9.62" } ], "hostname": [ { "arch": "x86_64", "epoch": null, "name": "hostname", "release": "6.el9", "source": "rpm", "version": "3.23" } ], "hwdata": [ { "arch": "noarch", "epoch": null, "name": "hwdata", "release": "9.3.el9", "source": "rpm", "version": "0.348" } ], "ima-evm-utils": [ { "arch": "x86_64", "epoch": null, "name": "ima-evm-utils", "release": "4.el9", "source": "rpm", "version": "1.4" } ], 
"inih": [ { "arch": "x86_64", "epoch": null, "name": "inih", "release": "5.el9", "source": "rpm", "version": "49" } ], "initscripts-service": [ { "arch": "noarch", "epoch": null, "name": "initscripts-service", "release": "1.el9", "source": "rpm", "version": "10.11.4" } ], "insights-client": [ { "arch": "noarch", "epoch": 0, "name": "insights-client", "release": "8.el9", "source": "rpm", "version": "3.1.7" } ], "ipcalc": [ { "arch": "x86_64", "epoch": null, "name": "ipcalc", "release": "5.el9", "source": "rpm", "version": "1.0.0" } ], "iproute": [ { "arch": "x86_64", "epoch": null, "name": "iproute", "release": "2.2.el9_0", "source": "rpm", "version": "5.15.0" } ], "iproute-tc": [ { "arch": "x86_64", "epoch": null, "name": "iproute-tc", "release": "2.2.el9_0", "source": "rpm", "version": "5.15.0" } ], "iptables-libs": [ { "arch": "x86_64", "epoch": null, "name": "iptables-libs", "release": "28.el9", "source": "rpm", "version": "1.8.7" } ], "iputils": [ { "arch": "x86_64", "epoch": null, "name": "iputils", "release": "7.el9", "source": "rpm", "version": "20210202" } ], "irqbalance": [ { "arch": "x86_64", "epoch": 2, "name": "irqbalance", "release": "5.el9", "source": "rpm", "version": "1.8.0" } ], "jansson": [ { "arch": "x86_64", "epoch": null, "name": "jansson", "release": "1.el9", "source": "rpm", "version": "2.14" } ], "json-c": [ { "arch": "x86_64", "epoch": null, "name": "json-c", "release": "11.el9", "source": "rpm", "version": "0.14" } ], "json-glib": [ { "arch": "x86_64", "epoch": null, "name": "json-glib", "release": "1.el9", "source": "rpm", "version": "1.6.6" } ], "kbd": [ { "arch": "x86_64", "epoch": null, "name": "kbd", "release": "8.el9", "source": "rpm", "version": "2.4.0" } ], "kbd-misc": [ { "arch": "noarch", "epoch": null, "name": "kbd-misc", "release": "8.el9", "source": "rpm", "version": "2.4.0" } ], "kernel": [ { "arch": "x86_64", "epoch": null, "name": "kernel", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-core": [ { 
"arch": "x86_64", "epoch": null, "name": "kernel-core", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-modules": [ { "arch": "x86_64", "epoch": null, "name": "kernel-modules", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-tools": [ { "arch": "x86_64", "epoch": null, "name": "kernel-tools", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-tools-libs": [ { "arch": "x86_64", "epoch": null, "name": "kernel-tools-libs", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kexec-tools": [ { "arch": "x86_64", "epoch": null, "name": "kexec-tools", "release": "2.el9", "source": "rpm", "version": "2.0.24" } ], "keyutils": [ { "arch": "x86_64", "epoch": null, "name": "keyutils", "release": "4.el9", "source": "rpm", "version": "1.6.1" } ], "keyutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "keyutils-libs", "release": "4.el9", "source": "rpm", "version": "1.6.1" } ], "kmod": [ { "arch": "x86_64", "epoch": null, "name": "kmod", "release": "7.el9", "source": "rpm", "version": "28" } ], "kmod-kvdo": [ { "arch": "x86_64", "epoch": null, "name": "kmod-kvdo", "release": "24.el9_0", "source": "rpm", "version": "8.1.1.371" } ], "kmod-libs": [ { "arch": "x86_64", "epoch": null, "name": "kmod-libs", "release": "7.el9", "source": "rpm", "version": "28" } ], "kpartx": [ { "arch": "x86_64", "epoch": null, "name": "kpartx", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "krb5-libs": [ { "arch": "x86_64", "epoch": null, "name": "krb5-libs", "release": "18.el9", "source": "rpm", "version": "1.19.1" } ], "less": [ { "arch": "x86_64", "epoch": null, "name": "less", "release": "1.el9_0", "source": "rpm", "version": "590" } ], "libacl": [ { "arch": "x86_64", "epoch": null, "name": "libacl", "release": "3.el9", "source": "rpm", "version": "2.3.1" } ], "libaio": [ { "arch": "x86_64", "epoch": null, "name": "libaio", "release": "13.el9", "source": "rpm", "version": "0.3.111" } ], 
"libappstream-glib": [ { "arch": "x86_64", "epoch": null, "name": "libappstream-glib", "release": "4.el9", "source": "rpm", "version": "0.7.18" } ], "libarchive": [ { "arch": "x86_64", "epoch": null, "name": "libarchive", "release": "2.el9_0", "source": "rpm", "version": "3.5.3" } ], "libassuan": [ { "arch": "x86_64", "epoch": null, "name": "libassuan", "release": "3.el9", "source": "rpm", "version": "2.5.5" } ], "libattr": [ { "arch": "x86_64", "epoch": null, "name": "libattr", "release": "3.el9", "source": "rpm", "version": "2.5.1" } ], "libbasicobjects": [ { "arch": "x86_64", "epoch": null, "name": "libbasicobjects", "release": "53.el9", "source": "rpm", "version": "0.1.1" } ], "libblkid": [ { "arch": "x86_64", "epoch": null, "name": "libblkid", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libblockdev": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-crypto": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-crypto", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-dm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-dm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-fs": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-fs", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-kbd": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-kbd", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-loop": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-loop", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-lvm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-lvm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-mdraid": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-mdraid", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-mpath": [ { 
"arch": "x86_64", "epoch": null, "name": "libblockdev-mpath", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-nvdimm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-nvdimm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-part": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-part", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-swap": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-swap", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-utils": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-utils", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libbpf": [ { "arch": "x86_64", "epoch": 2, "name": "libbpf", "release": "4.el9", "source": "rpm", "version": "0.5.0" } ], "libbrotli": [ { "arch": "x86_64", "epoch": null, "name": "libbrotli", "release": "6.el9", "source": "rpm", "version": "1.0.9" } ], "libbytesize": [ { "arch": "x86_64", "epoch": null, "name": "libbytesize", "release": "3.el9", "source": "rpm", "version": "2.5" } ], "libcap": [ { "arch": "x86_64", "epoch": null, "name": "libcap", "release": "8.el9", "source": "rpm", "version": "2.48" } ], "libcap-ng": [ { "arch": "x86_64", "epoch": null, "name": "libcap-ng", "release": "7.el9", "source": "rpm", "version": "0.8.2" } ], "libcbor": [ { "arch": "x86_64", "epoch": null, "name": "libcbor", "release": "5.el9", "source": "rpm", "version": "0.7.0" } ], "libcollection": [ { "arch": "x86_64", "epoch": null, "name": "libcollection", "release": "53.el9", "source": "rpm", "version": "0.7.0" } ], "libcom_err": [ { "arch": "x86_64", "epoch": null, "name": "libcom_err", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "libcomps": [ { "arch": "x86_64", "epoch": null, "name": "libcomps", "release": "1.el9", "source": "rpm", "version": "0.1.18" } ], "libcurl": [ { "arch": "x86_64", "epoch": null, "name": "libcurl", "release": "18.el9", "source": "rpm", 
"version": "7.76.1" } ], "libdaemon": [ { "arch": "x86_64", "epoch": null, "name": "libdaemon", "release": "23.el9", "source": "rpm", "version": "0.14" } ], "libdb": [ { "arch": "x86_64", "epoch": null, "name": "libdb", "release": "53.el9", "source": "rpm", "version": "5.3.28" } ], "libdhash": [ { "arch": "x86_64", "epoch": null, "name": "libdhash", "release": "53.el9", "source": "rpm", "version": "0.5.0" } ], "libdnf": [ { "arch": "x86_64", "epoch": null, "name": "libdnf", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "libdnf-plugin-subscription-manager": [ { "arch": "x86_64", "epoch": null, "name": "libdnf-plugin-subscription-manager", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "libeconf": [ { "arch": "x86_64", "epoch": null, "name": "libeconf", "release": "2.el9", "source": "rpm", "version": "0.4.1" } ], "libedit": [ { "arch": "x86_64", "epoch": null, "name": "libedit", "release": "37.20210216cvs.el9", "source": "rpm", "version": "3.1" } ], "libestr": [ { "arch": "x86_64", "epoch": null, "name": "libestr", "release": "4.el9", "source": "rpm", "version": "0.1.11" } ], "libev": [ { "arch": "x86_64", "epoch": null, "name": "libev", "release": "5.el9", "source": "rpm", "version": "4.33" } ], "libevent": [ { "arch": "x86_64", "epoch": null, "name": "libevent", "release": "6.el9", "source": "rpm", "version": "2.1.12" } ], "libfastjson": [ { "arch": "x86_64", "epoch": null, "name": "libfastjson", "release": "3.el9", "source": "rpm", "version": "0.99.9" } ], "libfdisk": [ { "arch": "x86_64", "epoch": null, "name": "libfdisk", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libffi": [ { "arch": "x86_64", "epoch": null, "name": "libffi", "release": "7.el9", "source": "rpm", "version": "3.4.2" } ], "libfido2": [ { "arch": "x86_64", "epoch": null, "name": "libfido2", "release": "7.el9", "source": "rpm", "version": "1.6.0" } ], "libgcab1": [ { "arch": "x86_64", "epoch": null, "name": "libgcab1", "release": "6.el9", 
"source": "rpm", "version": "1.4" } ], "libgcc": [ { "arch": "x86_64", "epoch": null, "name": "libgcc", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], "libgcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libgcrypt", "release": "4.el9_0", "source": "rpm", "version": "1.10.0" } ], "libgomp": [ { "arch": "x86_64", "epoch": null, "name": "libgomp", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], "libgpg-error": [ { "arch": "x86_64", "epoch": null, "name": "libgpg-error", "release": "5.el9", "source": "rpm", "version": "1.42" } ], "libgudev": [ { "arch": "x86_64", "epoch": null, "name": "libgudev", "release": "1.el9", "source": "rpm", "version": "237" } ], "libgusb": [ { "arch": "x86_64", "epoch": null, "name": "libgusb", "release": "1.el9", "source": "rpm", "version": "0.3.8" } ], "libibverbs": [ { "arch": "x86_64", "epoch": null, "name": "libibverbs", "release": "1.el9", "source": "rpm", "version": "37.2" } ], "libicu": [ { "arch": "x86_64", "epoch": null, "name": "libicu", "release": "9.el9", "source": "rpm", "version": "67.1" } ], "libidn2": [ { "arch": "x86_64", "epoch": null, "name": "libidn2", "release": "7.el9", "source": "rpm", "version": "2.3.0" } ], "libini_config": [ { "arch": "x86_64", "epoch": null, "name": "libini_config", "release": "53.el9", "source": "rpm", "version": "1.3.1" } ], "libjcat": [ { "arch": "x86_64", "epoch": null, "name": "libjcat", "release": "3.el9", "source": "rpm", "version": "0.1.6" } ], "libjpeg-turbo": [ { "arch": "x86_64", "epoch": null, "name": "libjpeg-turbo", "release": "5.el9", "source": "rpm", "version": "2.0.90" } ], "libkcapi": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi", "release": "3.el9", "source": "rpm", "version": "1.3.1" } ], "libkcapi-hmaccalc": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi-hmaccalc", "release": "3.el9", "source": "rpm", "version": "1.3.1" } ], "libksba": [ { "arch": "x86_64", "epoch": null, "name": "libksba", "release": "4.el9", "source": 
"rpm", "version": "1.5.1" } ], "libldb": [ { "arch": "x86_64", "epoch": null, "name": "libldb", "release": "1.el9", "source": "rpm", "version": "2.5.0" } ], "libmaxminddb": [ { "arch": "x86_64", "epoch": null, "name": "libmaxminddb", "release": "3.el9", "source": "rpm", "version": "1.5.2" } ], "libmnl": [ { "arch": "x86_64", "epoch": null, "name": "libmnl", "release": "15.el9", "source": "rpm", "version": "1.0.4" } ], "libmodulemd": [ { "arch": "x86_64", "epoch": null, "name": "libmodulemd", "release": "2.el9", "source": "rpm", "version": "2.13.0" } ], "libmount": [ { "arch": "x86_64", "epoch": null, "name": "libmount", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libndp": [ { "arch": "x86_64", "epoch": null, "name": "libndp", "release": "4.el9", "source": "rpm", "version": "1.8" } ], "libnetfilter_conntrack": [ { "arch": "x86_64", "epoch": null, "name": "libnetfilter_conntrack", "release": "4.el9", "source": "rpm", "version": "1.0.8" } ], "libnfnetlink": [ { "arch": "x86_64", "epoch": null, "name": "libnfnetlink", "release": "21.el9", "source": "rpm", "version": "1.0.1" } ], "libnfsidmap": [ { "arch": "x86_64", "epoch": 1, "name": "libnfsidmap", "release": "10.el9", "source": "rpm", "version": "2.5.4" } ], "libnghttp2": [ { "arch": "x86_64", "epoch": null, "name": "libnghttp2", "release": "5.el9", "source": "rpm", "version": "1.43.0" } ], "libnl3": [ { "arch": "x86_64", "epoch": null, "name": "libnl3", "release": "2.el9", "source": "rpm", "version": "3.6.0" } ], "libnl3-cli": [ { "arch": "x86_64", "epoch": null, "name": "libnl3-cli", "release": "2.el9", "source": "rpm", "version": "3.6.0" } ], "libpath_utils": [ { "arch": "x86_64", "epoch": null, "name": "libpath_utils", "release": "53.el9", "source": "rpm", "version": "0.2.1" } ], "libpcap": [ { "arch": "x86_64", "epoch": 14, "name": "libpcap", "release": "4.el9", "source": "rpm", "version": "1.10.0" } ], "libpipeline": [ { "arch": "x86_64", "epoch": null, "name": "libpipeline", "release": 
"4.el9", "source": "rpm", "version": "1.5.3" } ], "libpng": [ { "arch": "x86_64", "epoch": 2, "name": "libpng", "release": "12.el9", "source": "rpm", "version": "1.6.37" } ], "libproxy": [ { "arch": "x86_64", "epoch": null, "name": "libproxy", "release": "35.el9", "source": "rpm", "version": "0.4.15" } ], "libproxy-webkitgtk4": [ { "arch": "x86_64", "epoch": null, "name": "libproxy-webkitgtk4", "release": "35.el9", "source": "rpm", "version": "0.4.15" } ], "libpsl": [ { "arch": "x86_64", "epoch": null, "name": "libpsl", "release": "5.el9", "source": "rpm", "version": "0.21.1" } ], "libpwquality": [ { "arch": "x86_64", "epoch": null, "name": "libpwquality", "release": "8.el9", "source": "rpm", "version": "1.4.4" } ], "libref_array": [ { "arch": "x86_64", "epoch": null, "name": "libref_array", "release": "53.el9", "source": "rpm", "version": "0.1.5" } ], "librepo": [ { "arch": "x86_64", "epoch": null, "name": "librepo", "release": "1.el9", "source": "rpm", "version": "1.14.2" } ], "libreport-filesystem": [ { "arch": "noarch", "epoch": null, "name": "libreport-filesystem", "release": "6.el9", "source": "rpm", "version": "2.15.2" } ], "librhsm": [ { "arch": "x86_64", "epoch": null, "name": "librhsm", "release": "7.el9", "source": "rpm", "version": "0.0.3" } ], "libseccomp": [ { "arch": "x86_64", "epoch": null, "name": "libseccomp", "release": "2.el9", "source": "rpm", "version": "2.5.2" } ], "libselinux": [ { "arch": "x86_64", "epoch": null, "name": "libselinux", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libselinux-utils": [ { "arch": "x86_64", "epoch": null, "name": "libselinux-utils", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libsemanage": [ { "arch": "x86_64", "epoch": null, "name": "libsemanage", "release": "3.el9", "source": "rpm", "version": "3.3" } ], "libsepol": [ { "arch": "x86_64", "epoch": null, "name": "libsepol", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libsigsegv": [ { "arch": "x86_64", "epoch": null, 
"name": "libsigsegv", "release": "4.el9", "source": "rpm", "version": "2.13" } ], "libsmartcols": [ { "arch": "x86_64", "epoch": null, "name": "libsmartcols", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libsmbios": [ { "arch": "x86_64", "epoch": null, "name": "libsmbios", "release": "4.el9", "source": "rpm", "version": "2.4.3" } ], "libsolv": [ { "arch": "x86_64", "epoch": null, "name": "libsolv", "release": "1.el9", "source": "rpm", "version": "0.7.22" } ], "libsoup": [ { "arch": "x86_64", "epoch": null, "name": "libsoup", "release": "8.el9", "source": "rpm", "version": "2.72.0" } ], "libss": [ { "arch": "x86_64", "epoch": null, "name": "libss", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "libssh": [ { "arch": "x86_64", "epoch": null, "name": "libssh", "release": "3.el9", "source": "rpm", "version": "0.9.6" } ], "libssh-config": [ { "arch": "noarch", "epoch": null, "name": "libssh-config", "release": "3.el9", "source": "rpm", "version": "0.9.6" } ], "libsss_certmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_certmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_nss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_nss_idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_sudo": [ { "arch": "x86_64", "epoch": null, "name": "libsss_sudo", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libstdc++": [ { "arch": "x86_64", "epoch": null, "name": "libstdc++", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], "libstemmer": [ { "arch": "x86_64", "epoch": null, "name": "libstemmer", "release": "18.585svn.el9", "source": "rpm", "version": "0" } ], "libsysfs": [ { "arch": "x86_64", "epoch": null, "name": "libsysfs", "release": "10.el9", "source": "rpm", "version": "2.1.1" } ], "libtalloc": [ { "arch": 
"x86_64", "epoch": null, "name": "libtalloc", "release": "1.el9", "source": "rpm", "version": "2.3.3" } ], "libtasn1": [ { "arch": "x86_64", "epoch": null, "name": "libtasn1", "release": "7.el9", "source": "rpm", "version": "4.16.0" } ], "libtdb": [ { "arch": "x86_64", "epoch": null, "name": "libtdb", "release": "1.el9", "source": "rpm", "version": "1.4.6" } ], "libteam": [ { "arch": "x86_64", "epoch": null, "name": "libteam", "release": "11.el9", "source": "rpm", "version": "1.31" } ], "libtevent": [ { "arch": "x86_64", "epoch": null, "name": "libtevent", "release": "0.el9", "source": "rpm", "version": "0.12.0" } ], "libtirpc": [ { "arch": "x86_64", "epoch": null, "name": "libtirpc", "release": "1.el9", "source": "rpm", "version": "1.3.2" } ], "libunistring": [ { "arch": "x86_64", "epoch": null, "name": "libunistring", "release": "15.el9", "source": "rpm", "version": "0.9.10" } ], "libusbx": [ { "arch": "x86_64", "epoch": null, "name": "libusbx", "release": "1.el9", "source": "rpm", "version": "1.0.26" } ], "libuser": [ { "arch": "x86_64", "epoch": null, "name": "libuser", "release": "10.el9", "source": "rpm", "version": "0.63" } ], "libutempter": [ { "arch": "x86_64", "epoch": null, "name": "libutempter", "release": "6.el9", "source": "rpm", "version": "1.2.1" } ], "libuuid": [ { "arch": "x86_64", "epoch": null, "name": "libuuid", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libverto": [ { "arch": "x86_64", "epoch": null, "name": "libverto", "release": "3.el9", "source": "rpm", "version": "0.3.2" } ], "libverto-libev": [ { "arch": "x86_64", "epoch": null, "name": "libverto-libev", "release": "3.el9", "source": "rpm", "version": "0.3.2" } ], "libxcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt", "release": "3.el9", "source": "rpm", "version": "4.4.18" } ], "libxcrypt-compat": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt-compat", "release": "3.el9", "source": "rpm", "version": "4.4.18" } ], "libxml2": [ { "arch": 
"x86_64", "epoch": null, "name": "libxml2", "release": "2.el9", "source": "rpm", "version": "2.9.13" } ], "libxmlb": [ { "arch": "x86_64", "epoch": null, "name": "libxmlb", "release": "1.el9", "source": "rpm", "version": "0.3.3" } ], "libyaml": [ { "arch": "x86_64", "epoch": null, "name": "libyaml", "release": "7.el9", "source": "rpm", "version": "0.2.5" } ], "libzstd": [ { "arch": "x86_64", "epoch": null, "name": "libzstd", "release": "2.el9", "source": "rpm", "version": "1.5.1" } ], "linux-firmware": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware", "release": "126.el9", "source": "rpm", "version": "20220509" } ], "linux-firmware-whence": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware-whence", "release": "126.el9", "source": "rpm", "version": "20220509" } ], "lmdb-libs": [ { "arch": "x86_64", "epoch": null, "name": "lmdb-libs", "release": "3.el9", "source": "rpm", "version": "0.9.29" } ], "logrotate": [ { "arch": "x86_64", "epoch": null, "name": "logrotate", "release": "5.el9", "source": "rpm", "version": "3.18.0" } ], "lshw": [ { "arch": "x86_64", "epoch": null, "name": "lshw", "release": "7.el9", "source": "rpm", "version": "B.02.19.2" } ], "lsof": [ { "arch": "x86_64", "epoch": null, "name": "lsof", "release": "3.el9", "source": "rpm", "version": "4.94.0" } ], "lsscsi": [ { "arch": "x86_64", "epoch": null, "name": "lsscsi", "release": "6.el9", "source": "rpm", "version": "0.32" } ], "lua-libs": [ { "arch": "x86_64", "epoch": null, "name": "lua-libs", "release": "4.el9", "source": "rpm", "version": "5.4.2" } ], "lvm2": [ { "arch": "x86_64", "epoch": 9, "name": "lvm2", "release": "4.el9", "source": "rpm", "version": "2.03.14" } ], "lvm2-libs": [ { "arch": "x86_64", "epoch": 9, "name": "lvm2-libs", "release": "4.el9", "source": "rpm", "version": "2.03.14" } ], "lz4-libs": [ { "arch": "x86_64", "epoch": null, "name": "lz4-libs", "release": "5.el9", "source": "rpm", "version": "1.9.3" } ], "lzo": [ { "arch": "x86_64", "epoch": null, 
"name": "lzo", "release": "7.el9", "source": "rpm", "version": "2.10" } ], "man-db": [ { "arch": "x86_64", "epoch": null, "name": "man-db", "release": "6.el9", "source": "rpm", "version": "2.9.3" } ], "mdadm": [ { "arch": "x86_64", "epoch": null, "name": "mdadm", "release": "2.el9", "source": "rpm", "version": "4.2" } ], "microcode_ctl": [ { "arch": "noarch", "epoch": 4, "name": "microcode_ctl", "release": "1.el9", "source": "rpm", "version": "20220207" } ], "mokutil": [ { "arch": "x86_64", "epoch": 2, "name": "mokutil", "release": "9.el9", "source": "rpm", "version": "0.4.0" } ], "mpfr": [ { "arch": "x86_64", "epoch": null, "name": "mpfr", "release": "7.el9", "source": "rpm", "version": "4.1.0" } ], "ncurses": [ { "arch": "x86_64", "epoch": null, "name": "ncurses", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ncurses-base": [ { "arch": "noarch", "epoch": null, "name": "ncurses-base", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ncurses-libs": [ { "arch": "x86_64", "epoch": null, "name": "ncurses-libs", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ndctl": [ { "arch": "x86_64", "epoch": null, "name": "ndctl", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "ndctl-libs": [ { "arch": "x86_64", "epoch": null, "name": "ndctl-libs", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "nettle": [ { "arch": "x86_64", "epoch": null, "name": "nettle", "release": "2.el9", "source": "rpm", "version": "3.7.3" } ], "newt": [ { "arch": "x86_64", "epoch": null, "name": "newt", "release": "11.el9", "source": "rpm", "version": "0.52.21" } ], "nfs-utils": [ { "arch": "x86_64", "epoch": 1, "name": "nfs-utils", "release": "10.el9", "source": "rpm", "version": "2.5.4" } ], "npth": [ { "arch": "x86_64", "epoch": null, "name": "npth", "release": "8.el9", "source": "rpm", "version": "1.6" } ], "nspr": [ { "arch": "x86_64", "epoch": null, "name": "nspr", "release": "9.el9", "source": "rpm", 
"version": "4.32.0" } ], "nss": [ { "arch": "x86_64", "epoch": null, "name": "nss", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-softokn": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-softokn-freebl": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn-freebl", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-sysinit": [ { "arch": "x86_64", "epoch": null, "name": "nss-sysinit", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-util": [ { "arch": "x86_64", "epoch": null, "name": "nss-util", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "numactl-libs": [ { "arch": "x86_64", "epoch": null, "name": "numactl-libs", "release": "8.el9", "source": "rpm", "version": "2.0.14" } ], "oddjob": [ { "arch": "x86_64", "epoch": null, "name": "oddjob", "release": "5.el9", "source": "rpm", "version": "0.34.7" } ], "oddjob-mkhomedir": [ { "arch": "x86_64", "epoch": null, "name": "oddjob-mkhomedir", "release": "5.el9", "source": "rpm", "version": "0.34.7" } ], "openldap": [ { "arch": "x86_64", "epoch": null, "name": "openldap", "release": "5.el9", "source": "rpm", "version": "2.4.59" } ], "openssh": [ { "arch": "x86_64", "epoch": null, "name": "openssh", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssh-clients": [ { "arch": "x86_64", "epoch": null, "name": "openssh-clients", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssh-server": [ { "arch": "x86_64", "epoch": null, "name": "openssh-server", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssl": [ { "arch": "x86_64", "epoch": 1, "name": "openssl", "release": "33.el9_0", "source": "rpm", "version": "3.0.1" } ], "openssl-libs": [ { "arch": "x86_64", "epoch": 1, "name": "openssl-libs", "release": "33.el9_0", "source": "rpm", "version": "3.0.1" } ], "openssl-pkcs11": [ { "arch": "x86_64", "epoch": null, "name": 
"openssl-pkcs11", "release": "7.el9", "source": "rpm", "version": "0.4.11" } ], "os-prober": [ { "arch": "x86_64", "epoch": null, "name": "os-prober", "release": "9.el9", "source": "rpm", "version": "1.77" } ], "p11-kit": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit", "release": "2.el9", "source": "rpm", "version": "0.24.1" } ], "p11-kit-trust": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit-trust", "release": "2.el9", "source": "rpm", "version": "0.24.1" } ], "pam": [ { "arch": "x86_64", "epoch": null, "name": "pam", "release": "11.el9", "source": "rpm", "version": "1.5.1" } ], "parted": [ { "arch": "x86_64", "epoch": null, "name": "parted", "release": "6.el9", "source": "rpm", "version": "3.4" } ], "passwd": [ { "arch": "x86_64", "epoch": null, "name": "passwd", "release": "12.el9", "source": "rpm", "version": "0.80" } ], "pciutils": [ { "arch": "x86_64", "epoch": null, "name": "pciutils", "release": "5.el9", "source": "rpm", "version": "3.7.0" } ], "pciutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "pciutils-libs", "release": "5.el9", "source": "rpm", "version": "3.7.0" } ], "pcre": [ { "arch": "x86_64", "epoch": null, "name": "pcre", "release": "3.el9.3", "source": "rpm", "version": "8.44" } ], "pcre2": [ { "arch": "x86_64", "epoch": null, "name": "pcre2", "release": "2.el9", "source": "rpm", "version": "10.40" } ], "pcre2-syntax": [ { "arch": "noarch", "epoch": null, "name": "pcre2-syntax", "release": "2.el9", "source": "rpm", "version": "10.40" } ], "pigz": [ { "arch": "x86_64", "epoch": null, "name": "pigz", "release": "4.el9", "source": "rpm", "version": "2.5" } ], "policycoreutils": [ { "arch": "x86_64", "epoch": null, "name": "policycoreutils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "policycoreutils-python-utils": [ { "arch": "noarch", "epoch": null, "name": "policycoreutils-python-utils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "polkit": [ { "arch": "x86_64", "epoch": null, "name": 
"polkit", "release": "10.el9_0", "source": "rpm", "version": "0.117" } ], "polkit-libs": [ { "arch": "x86_64", "epoch": null, "name": "polkit-libs", "release": "10.el9_0", "source": "rpm", "version": "0.117" } ], "polkit-pkla-compat": [ { "arch": "x86_64", "epoch": null, "name": "polkit-pkla-compat", "release": "21.el9", "source": "rpm", "version": "0.1" } ], "popt": [ { "arch": "x86_64", "epoch": null, "name": "popt", "release": "8.el9", "source": "rpm", "version": "1.18" } ], "prefixdevname": [ { "arch": "x86_64", "epoch": null, "name": "prefixdevname", "release": "8.el9", "source": "rpm", "version": "0.1.0" } ], "procps-ng": [ { "arch": "x86_64", "epoch": null, "name": "procps-ng", "release": "5.el9", "source": "rpm", "version": "3.3.17" } ], "protobuf-c": [ { "arch": "x86_64", "epoch": null, "name": "protobuf-c", "release": "12.el9", "source": "rpm", "version": "1.3.3" } ], "psmisc": [ { "arch": "x86_64", "epoch": null, "name": "psmisc", "release": "3.el9", "source": "rpm", "version": "23.4" } ], "publicsuffix-list-dafsa": [ { "arch": "noarch", "epoch": null, "name": "publicsuffix-list-dafsa", "release": "3.el9", "source": "rpm", "version": "20210518" } ], "python-unversioned-command": [ { "arch": "noarch", "epoch": null, "name": "python-unversioned-command", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3": [ { "arch": "x86_64", "epoch": null, "name": "python3", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3-attrs": [ { "arch": "noarch", "epoch": null, "name": "python3-attrs", "release": "7.el9", "source": "rpm", "version": "20.3.0" } ], "python3-audit": [ { "arch": "x86_64", "epoch": null, "name": "python3-audit", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "python3-babel": [ { "arch": "noarch", "epoch": null, "name": "python3-babel", "release": "2.el9", "source": "rpm", "version": "2.9.1" } ], "python3-blivet": [ { "arch": "noarch", "epoch": 1, "name": "python3-blivet", "release": "13.el9_0", 
"source": "rpm", "version": "3.4.0" } ], "python3-blockdev": [ { "arch": "x86_64", "epoch": null, "name": "python3-blockdev", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "python3-bytesize": [ { "arch": "x86_64", "epoch": null, "name": "python3-bytesize", "release": "3.el9", "source": "rpm", "version": "2.5" } ], "python3-chardet": [ { "arch": "noarch", "epoch": null, "name": "python3-chardet", "release": "5.el9", "source": "rpm", "version": "4.0.0" } ], "python3-cloud-what": [ { "arch": "x86_64", "epoch": null, "name": "python3-cloud-what", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "python3-configobj": [ { "arch": "noarch", "epoch": null, "name": "python3-configobj", "release": "25.el9", "source": "rpm", "version": "5.0.6" } ], "python3-dasbus": [ { "arch": "noarch", "epoch": null, "name": "python3-dasbus", "release": "5.el9", "source": "rpm", "version": "1.4" } ], "python3-dateutil": [ { "arch": "noarch", "epoch": 1, "name": "python3-dateutil", "release": "6.el9", "source": "rpm", "version": "2.8.1" } ], "python3-dbus": [ { "arch": "x86_64", "epoch": null, "name": "python3-dbus", "release": "2.el9", "source": "rpm", "version": "1.2.18" } ], "python3-decorator": [ { "arch": "noarch", "epoch": null, "name": "python3-decorator", "release": "6.el9", "source": "rpm", "version": "4.4.2" } ], "python3-distro": [ { "arch": "noarch", "epoch": null, "name": "python3-distro", "release": "7.el9", "source": "rpm", "version": "1.5.0" } ], "python3-dmidecode": [ { "arch": "x86_64", "epoch": null, "name": "python3-dmidecode", "release": "27.el9", "source": "rpm", "version": "3.12.2" } ], "python3-dnf": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "python3-dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf-plugins-core", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "python3-ethtool": [ { "arch": "x86_64", "epoch": null, 
"name": "python3-ethtool", "release": "2.el9", "source": "rpm", "version": "0.15" } ], "python3-file-magic": [ { "arch": "noarch", "epoch": null, "name": "python3-file-magic", "release": "8.el9", "source": "rpm", "version": "5.39" } ], "python3-gobject-base": [ { "arch": "x86_64", "epoch": null, "name": "python3-gobject-base", "release": "5.el9", "source": "rpm", "version": "3.40.1" } ], "python3-gpg": [ { "arch": "x86_64", "epoch": null, "name": "python3-gpg", "release": "6.el9", "source": "rpm", "version": "1.15.1" } ], "python3-hawkey": [ { "arch": "x86_64", "epoch": null, "name": "python3-hawkey", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "python3-idna": [ { "arch": "noarch", "epoch": null, "name": "python3-idna", "release": "7.el9", "source": "rpm", "version": "2.10" } ], "python3-iniparse": [ { "arch": "noarch", "epoch": null, "name": "python3-iniparse", "release": "45.el9", "source": "rpm", "version": "0.4" } ], "python3-inotify": [ { "arch": "noarch", "epoch": null, "name": "python3-inotify", "release": "25.el9", "source": "rpm", "version": "0.9.6" } ], "python3-jinja2": [ { "arch": "noarch", "epoch": null, "name": "python3-jinja2", "release": "4.el9", "source": "rpm", "version": "2.11.3" } ], "python3-jsonpatch": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonpatch", "release": "16.el9", "source": "rpm", "version": "1.21" } ], "python3-jsonpointer": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonpointer", "release": "4.el9", "source": "rpm", "version": "2.0" } ], "python3-jsonschema": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonschema", "release": "13.el9", "source": "rpm", "version": "3.2.0" } ], "python3-libcomps": [ { "arch": "x86_64", "epoch": null, "name": "python3-libcomps", "release": "1.el9", "source": "rpm", "version": "0.1.18" } ], "python3-libdnf": [ { "arch": "x86_64", "epoch": null, "name": "python3-libdnf", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "python3-librepo": 
[ { "arch": "x86_64", "epoch": null, "name": "python3-librepo", "release": "1.el9", "source": "rpm", "version": "1.14.2" } ], "python3-libs": [ { "arch": "x86_64", "epoch": null, "name": "python3-libs", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3-libselinux": [ { "arch": "x86_64", "epoch": null, "name": "python3-libselinux", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "python3-libsemanage": [ { "arch": "x86_64", "epoch": null, "name": "python3-libsemanage", "release": "3.el9", "source": "rpm", "version": "3.3" } ], "python3-libxml2": [ { "arch": "x86_64", "epoch": null, "name": "python3-libxml2", "release": "2.el9", "source": "rpm", "version": "2.9.13" } ], "python3-linux-procfs": [ { "arch": "noarch", "epoch": null, "name": "python3-linux-procfs", "release": "1.el9", "source": "rpm", "version": "0.7.0" } ], "python3-markupsafe": [ { "arch": "x86_64", "epoch": null, "name": "python3-markupsafe", "release": "12.el9", "source": "rpm", "version": "1.1.1" } ], "python3-netifaces": [ { "arch": "x86_64", "epoch": null, "name": "python3-netifaces", "release": "15.el9", "source": "rpm", "version": "0.10.6" } ], "python3-oauthlib": [ { "arch": "noarch", "epoch": null, "name": "python3-oauthlib", "release": "2.el9", "source": "rpm", "version": "3.1.1" } ], "python3-perf": [ { "arch": "x86_64", "epoch": null, "name": "python3-perf", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "python3-pexpect": [ { "arch": "noarch", "epoch": null, "name": "python3-pexpect", "release": "7.el9", "source": "rpm", "version": "4.8.0" } ], "python3-pip-wheel": [ { "arch": "noarch", "epoch": null, "name": "python3-pip-wheel", "release": "6.el9", "source": "rpm", "version": "21.2.3" } ], "python3-policycoreutils": [ { "arch": "noarch", "epoch": null, "name": "python3-policycoreutils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "python3-prettytable": [ { "arch": "noarch", "epoch": null, "name": "python3-prettytable", 
"release": "27.el9", "source": "rpm", "version": "0.7.2" } ], "python3-ptyprocess": [ { "arch": "noarch", "epoch": null, "name": "python3-ptyprocess", "release": "12.el9", "source": "rpm", "version": "0.6.0" } ], "python3-pyparted": [ { "arch": "x86_64", "epoch": 1, "name": "python3-pyparted", "release": "4.el9", "source": "rpm", "version": "3.11.7" } ], "python3-pyrsistent": [ { "arch": "x86_64", "epoch": null, "name": "python3-pyrsistent", "release": "8.el9", "source": "rpm", "version": "0.17.3" } ], "python3-pyserial": [ { "arch": "noarch", "epoch": null, "name": "python3-pyserial", "release": "12.el9", "source": "rpm", "version": "3.4" } ], "python3-pysocks": [ { "arch": "noarch", "epoch": null, "name": "python3-pysocks", "release": "12.el9", "source": "rpm", "version": "1.7.1" } ], "python3-pytz": [ { "arch": "noarch", "epoch": null, "name": "python3-pytz", "release": "4.el9", "source": "rpm", "version": "2021.1" } ], "python3-pyudev": [ { "arch": "noarch", "epoch": null, "name": "python3-pyudev", "release": "6.el9", "source": "rpm", "version": "0.22.0" } ], "python3-pyyaml": [ { "arch": "x86_64", "epoch": null, "name": "python3-pyyaml", "release": "6.el9", "source": "rpm", "version": "5.4.1" } ], "python3-requests": [ { "arch": "noarch", "epoch": null, "name": "python3-requests", "release": "6.el9", "source": "rpm", "version": "2.25.1" } ], "python3-rpm": [ { "arch": "x86_64", "epoch": null, "name": "python3-rpm", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "python3-setools": [ { "arch": "x86_64", "epoch": null, "name": "python3-setools", "release": "4.el9", "source": "rpm", "version": "4.4.0" } ], "python3-setuptools": [ { "arch": "noarch", "epoch": null, "name": "python3-setuptools", "release": "10.el9", "source": "rpm", "version": "53.0.0" } ], "python3-setuptools-wheel": [ { "arch": "noarch", "epoch": null, "name": "python3-setuptools-wheel", "release": "10.el9", "source": "rpm", "version": "53.0.0" } ], "python3-six": [ { "arch": 
"noarch", "epoch": null, "name": "python3-six", "release": "9.el9", "source": "rpm", "version": "1.15.0" } ], "python3-subscription-manager-rhsm": [ { "arch": "x86_64", "epoch": null, "name": "python3-subscription-manager-rhsm", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "python3-systemd": [ { "arch": "x86_64", "epoch": null, "name": "python3-systemd", "release": "18.el9", "source": "rpm", "version": "234" } ], "python3-urllib3": [ { "arch": "noarch", "epoch": null, "name": "python3-urllib3", "release": "3.el9", "source": "rpm", "version": "1.26.5" } ], "qemu-guest-agent": [ { "arch": "x86_64", "epoch": 17, "name": "qemu-guest-agent", "release": "4.el9", "source": "rpm", "version": "7.0.0" } ], "quota": [ { "arch": "x86_64", "epoch": 1, "name": "quota", "release": "6.el9", "source": "rpm", "version": "4.06" } ], "quota-nls": [ { "arch": "noarch", "epoch": 1, "name": "quota-nls", "release": "6.el9", "source": "rpm", "version": "4.06" } ], "readline": [ { "arch": "x86_64", "epoch": null, "name": "readline", "release": "4.el9", "source": "rpm", "version": "8.1" } ], "redhat-logos": [ { "arch": "x86_64", "epoch": null, "name": "redhat-logos", "release": "1.el9", "source": "rpm", "version": "90.4" } ], "redhat-release": [ { "arch": "x86_64", "epoch": null, "name": "redhat-release", "release": "1.3.el9", "source": "rpm", "version": "9.1" } ], "redhat-release-eula": [ { "arch": "x86_64", "epoch": null, "name": "redhat-release-eula", "release": "1.3.el9", "source": "rpm", "version": "9.1" } ], "rhsm-icons": [ { "arch": "noarch", "epoch": null, "name": "rhsm-icons", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "rootfiles": [ { "arch": "noarch", "epoch": null, "name": "rootfiles", "release": "31.el9", "source": "rpm", "version": "8.1" } ], "rpcbind": [ { "arch": "x86_64", "epoch": null, "name": "rpcbind", "release": "2.el9", "source": "rpm", "version": "1.2.6" } ], "rpm": [ { "arch": "x86_64", "epoch": null, "name": "rpm", "release": 
"12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-build-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-build-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-audit": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-audit", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-selinux": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-selinux", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-systemd-inhibit": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-systemd-inhibit", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-sign-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-sign-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rsync": [ { "arch": "x86_64", "epoch": null, "name": "rsync", "release": "11.el9", "source": "rpm", "version": "3.2.3" } ], "rsyslog": [ { "arch": "x86_64", "epoch": null, "name": "rsyslog", "release": "105.el9", "source": "rpm", "version": "8.2102.0" } ], "rsyslog-logrotate": [ { "arch": "x86_64", "epoch": null, "name": "rsyslog-logrotate", "release": "105.el9", "source": "rpm", "version": "8.2102.0" } ], "sed": [ { "arch": "x86_64", "epoch": null, "name": "sed", "release": "9.el9", "source": "rpm", "version": "4.8" } ], "selinux-policy": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy", "release": "1.el9", "source": "rpm", "version": "34.1.33" } ], "selinux-policy-targeted": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy-targeted", "release": "1.el9", "source": "rpm", "version": "34.1.33" } ], "setroubleshoot-plugins": [ { "arch": "noarch", "epoch": null, "name": "setroubleshoot-plugins", "release": "4.el9", "source": "rpm", "version": "3.3.14" } ], "setroubleshoot-server": [ { "arch": 
"x86_64", "epoch": null, "name": "setroubleshoot-server", "release": "3.el9_0", "source": "rpm", "version": "3.3.28" } ], "setup": [ { "arch": "noarch", "epoch": null, "name": "setup", "release": "6.el9", "source": "rpm", "version": "2.13.7" } ], "sg3_utils": [ { "arch": "x86_64", "epoch": null, "name": "sg3_utils", "release": "8.el9", "source": "rpm", "version": "1.47" } ], "sg3_utils-libs": [ { "arch": "x86_64", "epoch": null, "name": "sg3_utils-libs", "release": "8.el9", "source": "rpm", "version": "1.47" } ], "shadow-utils": [ { "arch": "x86_64", "epoch": 2, "name": "shadow-utils", "release": "4.el9", "source": "rpm", "version": "4.9" } ], "shared-mime-info": [ { "arch": "x86_64", "epoch": null, "name": "shared-mime-info", "release": "4.el9", "source": "rpm", "version": "2.1" } ], "shim-x64": [ { "arch": "x86_64", "epoch": null, "name": "shim-x64", "release": "2.el9", "source": "rpm", "version": "15.5" } ], "slang": [ { "arch": "x86_64", "epoch": null, "name": "slang", "release": "11.el9", "source": "rpm", "version": "2.3.2" } ], "snappy": [ { "arch": "x86_64", "epoch": null, "name": "snappy", "release": "8.el9", "source": "rpm", "version": "1.1.8" } ], "sos": [ { "arch": "noarch", "epoch": null, "name": "sos", "release": "1.el9", "source": "rpm", "version": "4.3" } ], "sqlite-libs": [ { "arch": "x86_64", "epoch": null, "name": "sqlite-libs", "release": "5.el9", "source": "rpm", "version": "3.34.1" } ], "squashfs-tools": [ { "arch": "x86_64", "epoch": null, "name": "squashfs-tools", "release": "8.git1.el9", "source": "rpm", "version": "4.4" } ], "sscg": [ { "arch": "x86_64", "epoch": null, "name": "sscg", "release": "5.el9", "source": "rpm", "version": "3.0.0" } ], "sssd-client": [ { "arch": "x86_64", "epoch": null, "name": "sssd-client", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-common": [ { "arch": "x86_64", "epoch": null, "name": "sssd-common", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-kcm": [ { "arch": 
"x86_64", "epoch": null, "name": "sssd-kcm", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-nfs-idmap": [ { "arch": "x86_64", "epoch": null, "name": "sssd-nfs-idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "subscription-manager": [ { "arch": "x86_64", "epoch": null, "name": "subscription-manager", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "subscription-manager-cockpit": [ { "arch": "noarch", "epoch": null, "name": "subscription-manager-cockpit", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "subscription-manager-rhsm-certificates": [ { "arch": "x86_64", "epoch": null, "name": "subscription-manager-rhsm-certificates", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "sudo": [ { "arch": "x86_64", "epoch": null, "name": "sudo", "release": "7.el9", "source": "rpm", "version": "1.9.5p2" } ], "systemd": [ { "arch": "x86_64", "epoch": null, "name": "systemd", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-libs": [ { "arch": "x86_64", "epoch": null, "name": "systemd-libs", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-pam": [ { "arch": "x86_64", "epoch": null, "name": "systemd-pam", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-rpm-macros": [ { "arch": "noarch", "epoch": null, "name": "systemd-rpm-macros", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-udev": [ { "arch": "x86_64", "epoch": null, "name": "systemd-udev", "release": "7.el9", "source": "rpm", "version": "250" } ], "tar": [ { "arch": "x86_64", "epoch": 2, "name": "tar", "release": "3.el9", "source": "rpm", "version": "1.34" } ], "tcpdump": [ { "arch": "x86_64", "epoch": 14, "name": "tcpdump", "release": "6.el9", "source": "rpm", "version": "4.99.0" } ], "teamd": [ { "arch": "x86_64", "epoch": null, "name": "teamd", "release": "11.el9", "source": "rpm", "version": "1.31" } ], "tpm2-tss": [ { "arch": "x86_64", "epoch": null, 
"name": "tpm2-tss", "release": "7.el9", "source": "rpm", "version": "3.0.3" } ], "tuned": [ { "arch": "noarch", "epoch": null, "name": "tuned", "release": "2.el9", "source": "rpm", "version": "2.18.0" } ], "tzdata": [ { "arch": "noarch", "epoch": null, "name": "tzdata", "release": "1.el9", "source": "rpm", "version": "2022a" } ], "usermode": [ { "arch": "x86_64", "epoch": null, "name": "usermode", "release": "4.el9", "source": "rpm", "version": "1.114" } ], "userspace-rcu": [ { "arch": "x86_64", "epoch": null, "name": "userspace-rcu", "release": "6.el9", "source": "rpm", "version": "0.12.1" } ], "util-linux": [ { "arch": "x86_64", "epoch": null, "name": "util-linux", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "util-linux-core": [ { "arch": "x86_64", "epoch": null, "name": "util-linux-core", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "vdo": [ { "arch": "x86_64", "epoch": null, "name": "vdo", "release": "1.el9", "source": "rpm", "version": "8.1.1.360" } ], "vim-minimal": [ { "arch": "x86_64", "epoch": 2, "name": "vim-minimal", "release": "16.el9_0.2", "source": "rpm", "version": "8.2.2637" } ], "virt-what": [ { "arch": "x86_64", "epoch": null, "name": "virt-what", "release": "1.el9", "source": "rpm", "version": "1.22" } ], "volume_key-libs": [ { "arch": "x86_64", "epoch": null, "name": "volume_key-libs", "release": "15.el9", "source": "rpm", "version": "0.3.12" } ], "webkit2gtk3-jsc": [ { "arch": "x86_64", "epoch": null, "name": "webkit2gtk3-jsc", "release": "1.el9", "source": "rpm", "version": "2.36.1" } ], "which": [ { "arch": "x86_64", "epoch": null, "name": "which", "release": "28.el9", "source": "rpm", "version": "2.21" } ], "xfsprogs": [ { "arch": "x86_64", "epoch": null, "name": "xfsprogs", "release": "1.el9", "source": "rpm", "version": "5.14.2" } ], "xz": [ { "arch": "x86_64", "epoch": null, "name": "xz", "release": "7.el9", "source": "rpm", "version": "5.2.5" } ], "xz-libs": [ { "arch": "x86_64", "epoch": null, "name": 
"xz-libs", "release": "7.el9", "source": "rpm", "version": "5.2.5" } ], "yum": [ { "arch": "noarch", "epoch": null, "name": "yum", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "yum-utils": [ { "arch": "noarch", "epoch": null, "name": "yum-utils", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "zlib": [ { "arch": "x86_64", "epoch": null, "name": "zlib", "release": "33.el9", "source": "rpm", "version": "1.2.11" } ], "zstd": [ { "arch": "x86_64", "epoch": null, "name": "zstd", "release": "2.el9", "source": "rpm", "version": "1.5.1" } ] } }, "changed": false } TASK [Set blivet package name] ************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:29 Wednesday 01 June 2022 16:38:45 +0000 (0:00:01.072) 0:00:11.347 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "blivet_pkg_name": [ "python3-blivet" ] }, "changed": false } TASK [Set blivet package version] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:33 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.082) 0:00:11.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "blivet_pkg_version": "3.4.0-13.el9_0" }, "changed": false } TASK [Check if kvdo is loadable] *********************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:37 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.077) 0:00:11.507 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false, "cmd": "set -euo pipefail\nmodprobe --dry-run kvdo\n", "delta": "0:00:00.004925", "end": "2022-06-01 12:38:45.152848", "rc": 1, "start": "2022-06-01 12:38:45.147923" } STDERR: modprobe: FATAL: Module kvdo not found in directory /lib/modules/5.14.0-101.el9.x86_64 MSG: non-zero return code ...ignoring TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:46 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.547) 0:00:12.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create LVM VDO volume under volume group 'pool1'] ************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:51 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.082) 0:00:12.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:66 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.080) 0:00:12.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:68 Wednesday 01 June 2022 16:38:45 +0000 (0:00:00.079) 0:00:12.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:83 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.081) 0:00:12.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove LVM VDO 
volume in 'pool1' created above] ************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:85 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.078) 0:00:12.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:101 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.078) 0:00:12.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create LVM VDO volume under volume group 'pool1' (this time default size)] *** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:103 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.080) 0:00:12.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:117 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.077) 0:00:12.695 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove LVM VDO volume in 'pool1' created above] ************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:119 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.078) 0:00:12.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:134 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.087) 0:00:12.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } META: ran handlers META: ran 
handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=0 skipped=23 rescued=0 ignored=1 Wednesday 01 June 2022 16:38:46 +0000 (0:00:00.039) 0:00:12.901 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.11s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:2 ------------------ linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gather package facts ---------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:25 ----------------- linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.56s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Check if kvdo is loadable ----------------------------------------------- 0.55s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:37 ----------------- linux-system-roles.storage : manage the pools and volumes 
to match the specified state --- 0.50s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.47s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ Remove LVM VDO volume in 'pool1' created above -------------------------- 0.09s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:119 ---------------- include_tasks ----------------------------------------------------------- 0.08s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:46 ----------------- Set blivet package name ------------------------------------------------- 0.08s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:29 ----------------- Repeat the previous invocation to verify idempotence -------------------- 0.08s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:68 ----------------- include_tasks ----------------------------------------------------------- 0.08s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:101 ---------------- Create LVM VDO volume under volume group 'pool1' ------------------------ 0.08s /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:51 ----------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file 
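The "Check if kvdo is loadable" task recorded earlier in this log fails because the kernel in the test image has no kvdo module (`modprobe: FATAL: Module kvdo not found`), and the failure is deliberately ignored so the VDO scenarios can be skipped. A minimal sketch of that guard pattern follows; the shell command is taken verbatim from the log, while the task names, the `kvdo_check` register, and the `storage_test_vdo_available` fact are illustrative, not the test suite's actual variable names:

```yaml
# Sketch of the kvdo loadability guard seen in the log above.
# The command matches the recorded cmd; variable names are hypothetical.
- name: Check if kvdo is loadable
  shell: |
    set -euo pipefail
    modprobe --dry-run kvdo
  register: kvdo_check
  ignore_errors: true   # the log shows the non-zero rc being ignored ("...ignoring")

- name: Record whether VDO scenarios can run
  set_fact:
    storage_test_vdo_available: "{{ kvdo_check.rc == 0 }}"
```

With `storage_test_vdo_available` false, conditionals on the subsequent create/remove tasks evaluate to false, which matches the long run of `skip_reason: "Conditional result was False"` entries in the log.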
Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:38:47 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden 
due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:38:48 +0000 (0:00:01.232) 0:00:01.256 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvmvdo_then_remove_nvme_generated.yml ******************* 2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:38:48 +0000 (0:00:00.022) 0:00:01.278 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 
'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:38:49 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact 
that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:38:50 +0000 (0:00:01.272) 0:00:01.295 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_lvmvdo_then_remove_scsi_generated.yml ******************* 2 plays in /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_scsi_generated.yml:3 Wednesday 01 June 2022 16:38:50 +0000 (0:00:00.020) 0:00:01.315 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_scsi_generated.yml:7 Wednesday 01 June 2022 16:38:51 +0000 (0:00:01.028) 0:00:02.344 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:2 Wednesday 01 June 2022 16:38:51 +0000 (0:00:00.025) 0:00:02.369 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: 
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:15 Wednesday 01 June 2022 16:38:52 +0000 (0:00:00.794) 0:00:03.163 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:38:52 +0000 (0:00:00.039) 0:00:03.203 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:38:52 +0000 (0:00:00.170) 0:00:03.373 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.526) 0:00:03.899 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define 
an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.077) 0:00:03.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.023) 0:00:04.000 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.022) 0:00:04.022 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.193) 0:00:04.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:38:53 +0000 (0:00:00.019) 0:00:04.235 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:38:54 +0000 (0:00:00.990) 0:00:05.226 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": 
"VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:38:54 +0000 (0:00:00.046) 0:00:05.272 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:38:54 +0000 (0:00:00.044) 0:00:05.317 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:38:55 +0000 (0:00:00.694) 0:00:06.012 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:38:55 +0000 (0:00:00.082) 0:00:06.094 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:38:55 +0000 (0:00:00.020) 0:00:06.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:38:55 +0000 
(0:00:00.022) 0:00:06.137 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:38:55 +0000 (0:00:00.020) 0:00:06.157 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:38:56 +0000 (0:00:00.823) 0:00:06.980 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": 
"mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": 
"static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:38:58 +0000 (0:00:01.798) 0:00:08.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.042) 0:00:08.821 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 
June 2022 16:38:58 +0000 (0:00:00.026) 0:00:08.848 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.522) 0:00:09.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.028) 0:00:09.399 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.024) 0:00:09.424 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.030) 0:00:09.454 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.028) 0:00:09.483 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete 
mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.028) 0:00:09.512 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.025) 0:00:09.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.028) 0:00:09.567 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.025) 0:00:09.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:38:58 +0000 (0:00:00.027) 0:00:09.620 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:38:59 +0000 (0:00:00.450) 0:00:10.070 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:38:59 +0000 (0:00:00.029) 0:00:10.099 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:18 Wednesday 01 June 2022 16:39:00 +0000 (0:00:00.831) 0:00:10.931 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Gather package facts] **************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:25 Wednesday 01 June 2022 16:39:00 +0000 (0:00:00.030) 0:00:10.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "packages": { "NetworkManager": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-libnm": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-libnm", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-team": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-team", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "NetworkManager-tui": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-tui", "release": "1.el9", "source": "rpm", "version": "1.39.5" } ], "PackageKit": [ { "arch": "x86_64", "epoch": null, "name": "PackageKit", "release": "2.el9", "source": "rpm", "version": "1.2.4" } ], "PackageKit-glib": [ { "arch": "x86_64", "epoch": null, "name": "PackageKit-glib", "release": "2.el9", "source": "rpm", "version": "1.2.4" } ], "abattis-cantarell-fonts": [ { "arch": "noarch", "epoch": null, "name": "abattis-cantarell-fonts", "release": "4.el9", "source": "rpm", "version": "0.301" } ], "acl": [ { "arch": "x86_64", "epoch": null, "name": "acl", "release": "3.el9", "source": "rpm", "version": "2.3.1" } ], "alternatives": [ { "arch": "x86_64", "epoch": null, "name": "alternatives", "release": "2.el9", "source": "rpm", "version": "1.20" } ], "audit": [ { 
"arch": "x86_64", "epoch": null, "name": "audit", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "audit-libs": [ { "arch": "x86_64", "epoch": null, "name": "audit-libs", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "authselect": [ { "arch": "x86_64", "epoch": null, "name": "authselect", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "authselect-compat": [ { "arch": "x86_64", "epoch": null, "name": "authselect-compat", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "authselect-libs": [ { "arch": "x86_64", "epoch": null, "name": "authselect-libs", "release": "1.el9", "source": "rpm", "version": "1.2.5" } ], "basesystem": [ { "arch": "noarch", "epoch": null, "name": "basesystem", "release": "13.el9", "source": "rpm", "version": "11" } ], "bash": [ { "arch": "x86_64", "epoch": null, "name": "bash", "release": "4.el9", "source": "rpm", "version": "5.1.8" } ], "blivet-data": [ { "arch": "noarch", "epoch": 1, "name": "blivet-data", "release": "13.el9_0", "source": "rpm", "version": "3.4.0" } ], "bubblewrap": [ { "arch": "x86_64", "epoch": null, "name": "bubblewrap", "release": "6.el9", "source": "rpm", "version": "0.4.1" } ], "bzip2": [ { "arch": "x86_64", "epoch": null, "name": "bzip2", "release": "8.el9", "source": "rpm", "version": "1.0.8" } ], "bzip2-libs": [ { "arch": "x86_64", "epoch": null, "name": "bzip2-libs", "release": "8.el9", "source": "rpm", "version": "1.0.8" } ], "c-ares": [ { "arch": "x86_64", "epoch": null, "name": "c-ares", "release": "5.el9", "source": "rpm", "version": "1.17.1" } ], "ca-certificates": [ { "arch": "noarch", "epoch": null, "name": "ca-certificates", "release": "94.el9", "source": "rpm", "version": "2020.2.50" } ], "checkpolicy": [ { "arch": "x86_64", "epoch": null, "name": "checkpolicy", "release": "1.el9", "source": "rpm", "version": "3.3" } ], "chrony": [ { "arch": "x86_64", "epoch": null, "name": "chrony", "release": "1.el9", "source": "rpm", "version": "4.2" } ], 
"cloud-init": [ { "arch": "noarch", "epoch": null, "name": "cloud-init", "release": "1.el9", "source": "rpm", "version": "22.1" } ], "cloud-utils-growpart": [ { "arch": "x86_64", "epoch": null, "name": "cloud-utils-growpart", "release": "10.el9", "source": "rpm", "version": "0.31" } ], "cockpit-bridge": [ { "arch": "x86_64", "epoch": null, "name": "cockpit-bridge", "release": "1.el9", "source": "rpm", "version": "269" } ], "cockpit-system": [ { "arch": "noarch", "epoch": null, "name": "cockpit-system", "release": "1.el9", "source": "rpm", "version": "269" } ], "cockpit-ws": [ { "arch": "x86_64", "epoch": null, "name": "cockpit-ws", "release": "1.el9", "source": "rpm", "version": "269" } ], "coreutils": [ { "arch": "x86_64", "epoch": null, "name": "coreutils", "release": "31.el9", "source": "rpm", "version": "8.32" } ], "coreutils-common": [ { "arch": "x86_64", "epoch": null, "name": "coreutils-common", "release": "31.el9", "source": "rpm", "version": "8.32" } ], "cpio": [ { "arch": "x86_64", "epoch": null, "name": "cpio", "release": "16.el9", "source": "rpm", "version": "2.13" } ], "cracklib": [ { "arch": "x86_64", "epoch": null, "name": "cracklib", "release": "27.el9", "source": "rpm", "version": "2.9.6" } ], "cracklib-dicts": [ { "arch": "x86_64", "epoch": null, "name": "cracklib-dicts", "release": "27.el9", "source": "rpm", "version": "2.9.6" } ], "cronie": [ { "arch": "x86_64", "epoch": null, "name": "cronie", "release": "5.el9", "source": "rpm", "version": "1.5.7" } ], "cronie-anacron": [ { "arch": "x86_64", "epoch": null, "name": "cronie-anacron", "release": "5.el9", "source": "rpm", "version": "1.5.7" } ], "crontabs": [ { "arch": "noarch", "epoch": null, "name": "crontabs", "release": "27.20190603git.el9_0", "source": "rpm", "version": "1.11" } ], "crypto-policies": [ { "arch": "noarch", "epoch": null, "name": "crypto-policies", "release": "1.gitb2323a1.el9", "source": "rpm", "version": "20220427" } ], "crypto-policies-scripts": [ { "arch": "noarch", 
"epoch": null, "name": "crypto-policies-scripts", "release": "1.gitb2323a1.el9", "source": "rpm", "version": "20220427" } ], "cryptsetup-libs": [ { "arch": "x86_64", "epoch": null, "name": "cryptsetup-libs", "release": "4.el9", "source": "rpm", "version": "2.4.3" } ], "curl": [ { "arch": "x86_64", "epoch": null, "name": "curl", "release": "18.el9", "source": "rpm", "version": "7.76.1" } ], "cyrus-sasl-lib": [ { "arch": "x86_64", "epoch": null, "name": "cyrus-sasl-lib", "release": "20.el9", "source": "rpm", "version": "2.1.27" } ], "daxctl-libs": [ { "arch": "x86_64", "epoch": null, "name": "daxctl-libs", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "dbus": [ { "arch": "x86_64", "epoch": 1, "name": "dbus", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-broker": [ { "arch": "x86_64", "epoch": null, "name": "dbus-broker", "release": "5.el9", "source": "rpm", "version": "28" } ], "dbus-common": [ { "arch": "noarch", "epoch": 1, "name": "dbus-common", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-libs": [ { "arch": "x86_64", "epoch": 1, "name": "dbus-libs", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "dbus-tools": [ { "arch": "x86_64", "epoch": 1, "name": "dbus-tools", "release": "5.el9", "source": "rpm", "version": "1.12.20" } ], "device-mapper": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-event": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-event", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-event-libs": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-event-libs", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-libs": [ { "arch": "x86_64", "epoch": 9, "name": "device-mapper-libs", "release": "4.el9", "source": "rpm", "version": "1.02.183" } ], "device-mapper-multipath": [ { "arch": "x86_64", "epoch": null, 
"name": "device-mapper-multipath", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "device-mapper-multipath-libs": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-multipath-libs", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "device-mapper-persistent-data": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-persistent-data", "release": "12.el9", "source": "rpm", "version": "0.9.0" } ], "dhcp-client": [ { "arch": "x86_64", "epoch": 12, "name": "dhcp-client", "release": "15.b1.el9", "source": "rpm", "version": "4.4.2" } ], "dhcp-common": [ { "arch": "noarch", "epoch": 12, "name": "dhcp-common", "release": "15.b1.el9", "source": "rpm", "version": "4.4.2" } ], "diffutils": [ { "arch": "x86_64", "epoch": null, "name": "diffutils", "release": "12.el9", "source": "rpm", "version": "3.7" } ], "dmidecode": [ { "arch": "x86_64", "epoch": 1, "name": "dmidecode", "release": "7.el9", "source": "rpm", "version": "3.3" } ], "dnf": [ { "arch": "noarch", "epoch": null, "name": "dnf", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "dnf-data": [ { "arch": "noarch", "epoch": null, "name": "dnf-data", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "dnf-plugins-core", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "dosfstools": [ { "arch": "x86_64", "epoch": null, "name": "dosfstools", "release": "3.el9", "source": "rpm", "version": "4.2" } ], "dracut": [ { "arch": "x86_64", "epoch": null, "name": "dracut", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-config-generic": [ { "arch": "x86_64", "epoch": null, "name": "dracut-config-generic", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-network": [ { "arch": "x86_64", "epoch": null, "name": "dracut-network", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "dracut-squash": [ { "arch": 
"x86_64", "epoch": null, "name": "dracut-squash", "release": "45.git20220404.el9_0", "source": "rpm", "version": "055" } ], "e2fsprogs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "e2fsprogs-libs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs-libs", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "efi-filesystem": [ { "arch": "noarch", "epoch": null, "name": "efi-filesystem", "release": "2.el9_0", "source": "rpm", "version": "6" } ], "efibootmgr": [ { "arch": "x86_64", "epoch": null, "name": "efibootmgr", "release": "12.el9", "source": "rpm", "version": "16" } ], "efivar-libs": [ { "arch": "x86_64", "epoch": null, "name": "efivar-libs", "release": "2.el9", "source": "rpm", "version": "38" } ], "elfutils-default-yama-scope": [ { "arch": "noarch", "epoch": null, "name": "elfutils-default-yama-scope", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "elfutils-libelf": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libelf", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "elfutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libs", "release": "4.el9", "source": "rpm", "version": "0.187" } ], "ethtool": [ { "arch": "x86_64", "epoch": 2, "name": "ethtool", "release": "1.el9", "source": "rpm", "version": "5.16" } ], "expat": [ { "arch": "x86_64", "epoch": null, "name": "expat", "release": "1.el9", "source": "rpm", "version": "2.4.7" } ], "file": [ { "arch": "x86_64", "epoch": null, "name": "file", "release": "8.el9", "source": "rpm", "version": "5.39" } ], "file-libs": [ { "arch": "x86_64", "epoch": null, "name": "file-libs", "release": "8.el9", "source": "rpm", "version": "5.39" } ], "filesystem": [ { "arch": "x86_64", "epoch": null, "name": "filesystem", "release": "2.el9", "source": "rpm", "version": "3.16" } ], "findutils": [ { "arch": "x86_64", "epoch": 1, "name": "findutils", "release": "5.el9", "source": "rpm", 
"version": "4.8.0" } ], "flashrom": [ { "arch": "x86_64", "epoch": null, "name": "flashrom", "release": "10.el9", "source": "rpm", "version": "1.2" } ], "fonts-filesystem": [ { "arch": "noarch", "epoch": 1, "name": "fonts-filesystem", "release": "7.el9.1", "source": "rpm", "version": "2.0.5" } ], "fuse-libs": [ { "arch": "x86_64", "epoch": null, "name": "fuse-libs", "release": "15.el9", "source": "rpm", "version": "2.9.9" } ], "fwupd": [ { "arch": "x86_64", "epoch": null, "name": "fwupd", "release": "2.el9_0", "source": "rpm", "version": "1.7.4" } ], "fwupd-plugin-flashrom": [ { "arch": "x86_64", "epoch": null, "name": "fwupd-plugin-flashrom", "release": "2.el9_0", "source": "rpm", "version": "1.7.4" } ], "gawk": [ { "arch": "x86_64", "epoch": null, "name": "gawk", "release": "6.el9", "source": "rpm", "version": "5.1.0" } ], "gawk-all-langpacks": [ { "arch": "x86_64", "epoch": null, "name": "gawk-all-langpacks", "release": "6.el9", "source": "rpm", "version": "5.1.0" } ], "gdbm-libs": [ { "arch": "x86_64", "epoch": 1, "name": "gdbm-libs", "release": "4.el9", "source": "rpm", "version": "1.19" } ], "gdisk": [ { "arch": "x86_64", "epoch": null, "name": "gdisk", "release": "5.el9", "source": "rpm", "version": "1.0.7" } ], "gdk-pixbuf2": [ { "arch": "x86_64", "epoch": null, "name": "gdk-pixbuf2", "release": "2.el9", "source": "rpm", "version": "2.42.6" } ], "geolite2-city": [ { "arch": "noarch", "epoch": null, "name": "geolite2-city", "release": "6.el9", "source": "rpm", "version": "20191217" } ], "geolite2-country": [ { "arch": "noarch", "epoch": null, "name": "geolite2-country", "release": "6.el9", "source": "rpm", "version": "20191217" } ], "gettext": [ { "arch": "x86_64", "epoch": null, "name": "gettext", "release": "7.el9", "source": "rpm", "version": "0.21" } ], "gettext-libs": [ { "arch": "x86_64", "epoch": null, "name": "gettext-libs", "release": "7.el9", "source": "rpm", "version": "0.21" } ], "glib-networking": [ { "arch": "x86_64", "epoch": null, "name": 
"glib-networking", "release": "3.el9", "source": "rpm", "version": "2.68.3" } ], "glib2": [ { "arch": "x86_64", "epoch": null, "name": "glib2", "release": "5.el9", "source": "rpm", "version": "2.68.4" } ], "glibc": [ { "arch": "x86_64", "epoch": null, "name": "glibc", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-common": [ { "arch": "x86_64", "epoch": null, "name": "glibc-common", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-gconv-extra": [ { "arch": "x86_64", "epoch": null, "name": "glibc-gconv-extra", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "glibc-minimal-langpack": [ { "arch": "x86_64", "epoch": null, "name": "glibc-minimal-langpack", "release": "32.el9", "source": "rpm", "version": "2.34" } ], "gmp": [ { "arch": "x86_64", "epoch": 1, "name": "gmp", "release": "10.el9", "source": "rpm", "version": "6.2.0" } ], "gnupg2": [ { "arch": "x86_64", "epoch": null, "name": "gnupg2", "release": "1.el9", "source": "rpm", "version": "2.3.3" } ], "gnutls": [ { "arch": "x86_64", "epoch": null, "name": "gnutls", "release": "9.el9", "source": "rpm", "version": "3.7.3" } ], "gobject-introspection": [ { "arch": "x86_64", "epoch": null, "name": "gobject-introspection", "release": "10.el9", "source": "rpm", "version": "1.68.0" } ], "gpgme": [ { "arch": "x86_64", "epoch": null, "name": "gpgme", "release": "6.el9", "source": "rpm", "version": "1.15.1" } ], "grep": [ { "arch": "x86_64", "epoch": null, "name": "grep", "release": "5.el9", "source": "rpm", "version": "3.6" } ], "groff-base": [ { "arch": "x86_64", "epoch": null, "name": "groff-base", "release": "10.el9", "source": "rpm", "version": "1.22.4" } ], "grub2-common": [ { "arch": "noarch", "epoch": 1, "name": "grub2-common", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-efi-x64": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-efi-x64", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-pc": [ { "arch": "x86_64", "epoch": 
1, "name": "grub2-pc", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-pc-modules": [ { "arch": "noarch", "epoch": 1, "name": "grub2-pc-modules", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-tools": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grub2-tools-minimal": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools-minimal", "release": "32.el9", "source": "rpm", "version": "2.06" } ], "grubby": [ { "arch": "x86_64", "epoch": null, "name": "grubby", "release": "55.el9", "source": "rpm", "version": "8.40" } ], "gsettings-desktop-schemas": [ { "arch": "x86_64", "epoch": null, "name": "gsettings-desktop-schemas", "release": "4.el9", "source": "rpm", "version": "40.0" } ], "gssproxy": [ { "arch": "x86_64", "epoch": null, "name": "gssproxy", "release": "4.el9", "source": "rpm", "version": "0.8.4" } ], "gzip": [ { "arch": "x86_64", "epoch": null, "name": "gzip", "release": "1.el9", "source": "rpm", "version": "1.12" } ], "hdparm": [ { "arch": "x86_64", "epoch": null, "name": "hdparm", "release": "2.el9", "source": "rpm", "version": "9.62" } ], "hostname": [ { "arch": "x86_64", "epoch": null, "name": "hostname", "release": "6.el9", "source": "rpm", "version": "3.23" } ], "hwdata": [ { "arch": "noarch", "epoch": null, "name": "hwdata", "release": "9.3.el9", "source": "rpm", "version": "0.348" } ], "ima-evm-utils": [ { "arch": "x86_64", "epoch": null, "name": "ima-evm-utils", "release": "4.el9", "source": "rpm", "version": "1.4" } ], "inih": [ { "arch": "x86_64", "epoch": null, "name": "inih", "release": "5.el9", "source": "rpm", "version": "49" } ], "initscripts-service": [ { "arch": "noarch", "epoch": null, "name": "initscripts-service", "release": "1.el9", "source": "rpm", "version": "10.11.4" } ], "insights-client": [ { "arch": "noarch", "epoch": 0, "name": "insights-client", "release": "8.el9", "source": "rpm", "version": "3.1.7" } ], "ipcalc": [ { 
"arch": "x86_64", "epoch": null, "name": "ipcalc", "release": "5.el9", "source": "rpm", "version": "1.0.0" } ], "iproute": [ { "arch": "x86_64", "epoch": null, "name": "iproute", "release": "2.2.el9_0", "source": "rpm", "version": "5.15.0" } ], "iproute-tc": [ { "arch": "x86_64", "epoch": null, "name": "iproute-tc", "release": "2.2.el9_0", "source": "rpm", "version": "5.15.0" } ], "iptables-libs": [ { "arch": "x86_64", "epoch": null, "name": "iptables-libs", "release": "28.el9", "source": "rpm", "version": "1.8.7" } ], "iputils": [ { "arch": "x86_64", "epoch": null, "name": "iputils", "release": "7.el9", "source": "rpm", "version": "20210202" } ], "irqbalance": [ { "arch": "x86_64", "epoch": 2, "name": "irqbalance", "release": "5.el9", "source": "rpm", "version": "1.8.0" } ], "jansson": [ { "arch": "x86_64", "epoch": null, "name": "jansson", "release": "1.el9", "source": "rpm", "version": "2.14" } ], "json-c": [ { "arch": "x86_64", "epoch": null, "name": "json-c", "release": "11.el9", "source": "rpm", "version": "0.14" } ], "json-glib": [ { "arch": "x86_64", "epoch": null, "name": "json-glib", "release": "1.el9", "source": "rpm", "version": "1.6.6" } ], "kbd": [ { "arch": "x86_64", "epoch": null, "name": "kbd", "release": "8.el9", "source": "rpm", "version": "2.4.0" } ], "kbd-misc": [ { "arch": "noarch", "epoch": null, "name": "kbd-misc", "release": "8.el9", "source": "rpm", "version": "2.4.0" } ], "kernel": [ { "arch": "x86_64", "epoch": null, "name": "kernel", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-core": [ { "arch": "x86_64", "epoch": null, "name": "kernel-core", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-modules": [ { "arch": "x86_64", "epoch": null, "name": "kernel-modules", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-tools": [ { "arch": "x86_64", "epoch": null, "name": "kernel-tools", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kernel-tools-libs": [ 
{ "arch": "x86_64", "epoch": null, "name": "kernel-tools-libs", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "kexec-tools": [ { "arch": "x86_64", "epoch": null, "name": "kexec-tools", "release": "2.el9", "source": "rpm", "version": "2.0.24" } ], "keyutils": [ { "arch": "x86_64", "epoch": null, "name": "keyutils", "release": "4.el9", "source": "rpm", "version": "1.6.1" } ], "keyutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "keyutils-libs", "release": "4.el9", "source": "rpm", "version": "1.6.1" } ], "kmod": [ { "arch": "x86_64", "epoch": null, "name": "kmod", "release": "7.el9", "source": "rpm", "version": "28" } ], "kmod-kvdo": [ { "arch": "x86_64", "epoch": null, "name": "kmod-kvdo", "release": "24.el9_0", "source": "rpm", "version": "8.1.1.371" } ], "kmod-libs": [ { "arch": "x86_64", "epoch": null, "name": "kmod-libs", "release": "7.el9", "source": "rpm", "version": "28" } ], "kpartx": [ { "arch": "x86_64", "epoch": null, "name": "kpartx", "release": "9.el9", "source": "rpm", "version": "0.8.7" } ], "krb5-libs": [ { "arch": "x86_64", "epoch": null, "name": "krb5-libs", "release": "18.el9", "source": "rpm", "version": "1.19.1" } ], "less": [ { "arch": "x86_64", "epoch": null, "name": "less", "release": "1.el9_0", "source": "rpm", "version": "590" } ], "libacl": [ { "arch": "x86_64", "epoch": null, "name": "libacl", "release": "3.el9", "source": "rpm", "version": "2.3.1" } ], "libaio": [ { "arch": "x86_64", "epoch": null, "name": "libaio", "release": "13.el9", "source": "rpm", "version": "0.3.111" } ], "libappstream-glib": [ { "arch": "x86_64", "epoch": null, "name": "libappstream-glib", "release": "4.el9", "source": "rpm", "version": "0.7.18" } ], "libarchive": [ { "arch": "x86_64", "epoch": null, "name": "libarchive", "release": "2.el9_0", "source": "rpm", "version": "3.5.3" } ], "libassuan": [ { "arch": "x86_64", "epoch": null, "name": "libassuan", "release": "3.el9", "source": "rpm", "version": "2.5.5" } ], "libattr": [ { "arch": 
"x86_64", "epoch": null, "name": "libattr", "release": "3.el9", "source": "rpm", "version": "2.5.1" } ], "libbasicobjects": [ { "arch": "x86_64", "epoch": null, "name": "libbasicobjects", "release": "53.el9", "source": "rpm", "version": "0.1.1" } ], "libblkid": [ { "arch": "x86_64", "epoch": null, "name": "libblkid", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libblockdev": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-crypto": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-crypto", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-dm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-dm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-fs": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-fs", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-kbd": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-kbd", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-loop": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-loop", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-lvm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-lvm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-mdraid": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-mdraid", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-mpath": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-mpath", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-nvdimm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-nvdimm", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-part": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-part", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-swap": 
[ { "arch": "x86_64", "epoch": null, "name": "libblockdev-swap", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libblockdev-utils": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-utils", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "libbpf": [ { "arch": "x86_64", "epoch": 2, "name": "libbpf", "release": "4.el9", "source": "rpm", "version": "0.5.0" } ], "libbrotli": [ { "arch": "x86_64", "epoch": null, "name": "libbrotli", "release": "6.el9", "source": "rpm", "version": "1.0.9" } ], "libbytesize": [ { "arch": "x86_64", "epoch": null, "name": "libbytesize", "release": "3.el9", "source": "rpm", "version": "2.5" } ], "libcap": [ { "arch": "x86_64", "epoch": null, "name": "libcap", "release": "8.el9", "source": "rpm", "version": "2.48" } ], "libcap-ng": [ { "arch": "x86_64", "epoch": null, "name": "libcap-ng", "release": "7.el9", "source": "rpm", "version": "0.8.2" } ], "libcbor": [ { "arch": "x86_64", "epoch": null, "name": "libcbor", "release": "5.el9", "source": "rpm", "version": "0.7.0" } ], "libcollection": [ { "arch": "x86_64", "epoch": null, "name": "libcollection", "release": "53.el9", "source": "rpm", "version": "0.7.0" } ], "libcom_err": [ { "arch": "x86_64", "epoch": null, "name": "libcom_err", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "libcomps": [ { "arch": "x86_64", "epoch": null, "name": "libcomps", "release": "1.el9", "source": "rpm", "version": "0.1.18" } ], "libcurl": [ { "arch": "x86_64", "epoch": null, "name": "libcurl", "release": "18.el9", "source": "rpm", "version": "7.76.1" } ], "libdaemon": [ { "arch": "x86_64", "epoch": null, "name": "libdaemon", "release": "23.el9", "source": "rpm", "version": "0.14" } ], "libdb": [ { "arch": "x86_64", "epoch": null, "name": "libdb", "release": "53.el9", "source": "rpm", "version": "5.3.28" } ], "libdhash": [ { "arch": "x86_64", "epoch": null, "name": "libdhash", "release": "53.el9", "source": "rpm", "version": "0.5.0" } ], "libdnf": [ { "arch": 
"x86_64", "epoch": null, "name": "libdnf", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "libdnf-plugin-subscription-manager": [ { "arch": "x86_64", "epoch": null, "name": "libdnf-plugin-subscription-manager", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "libeconf": [ { "arch": "x86_64", "epoch": null, "name": "libeconf", "release": "2.el9", "source": "rpm", "version": "0.4.1" } ], "libedit": [ { "arch": "x86_64", "epoch": null, "name": "libedit", "release": "37.20210216cvs.el9", "source": "rpm", "version": "3.1" } ], "libestr": [ { "arch": "x86_64", "epoch": null, "name": "libestr", "release": "4.el9", "source": "rpm", "version": "0.1.11" } ], "libev": [ { "arch": "x86_64", "epoch": null, "name": "libev", "release": "5.el9", "source": "rpm", "version": "4.33" } ], "libevent": [ { "arch": "x86_64", "epoch": null, "name": "libevent", "release": "6.el9", "source": "rpm", "version": "2.1.12" } ], "libfastjson": [ { "arch": "x86_64", "epoch": null, "name": "libfastjson", "release": "3.el9", "source": "rpm", "version": "0.99.9" } ], "libfdisk": [ { "arch": "x86_64", "epoch": null, "name": "libfdisk", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libffi": [ { "arch": "x86_64", "epoch": null, "name": "libffi", "release": "7.el9", "source": "rpm", "version": "3.4.2" } ], "libfido2": [ { "arch": "x86_64", "epoch": null, "name": "libfido2", "release": "7.el9", "source": "rpm", "version": "1.6.0" } ], "libgcab1": [ { "arch": "x86_64", "epoch": null, "name": "libgcab1", "release": "6.el9", "source": "rpm", "version": "1.4" } ], "libgcc": [ { "arch": "x86_64", "epoch": null, "name": "libgcc", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], "libgcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libgcrypt", "release": "4.el9_0", "source": "rpm", "version": "1.10.0" } ], "libgomp": [ { "arch": "x86_64", "epoch": null, "name": "libgomp", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], 
"libgpg-error": [ { "arch": "x86_64", "epoch": null, "name": "libgpg-error", "release": "5.el9", "source": "rpm", "version": "1.42" } ], "libgudev": [ { "arch": "x86_64", "epoch": null, "name": "libgudev", "release": "1.el9", "source": "rpm", "version": "237" } ], "libgusb": [ { "arch": "x86_64", "epoch": null, "name": "libgusb", "release": "1.el9", "source": "rpm", "version": "0.3.8" } ], "libibverbs": [ { "arch": "x86_64", "epoch": null, "name": "libibverbs", "release": "1.el9", "source": "rpm", "version": "37.2" } ], "libicu": [ { "arch": "x86_64", "epoch": null, "name": "libicu", "release": "9.el9", "source": "rpm", "version": "67.1" } ], "libidn2": [ { "arch": "x86_64", "epoch": null, "name": "libidn2", "release": "7.el9", "source": "rpm", "version": "2.3.0" } ], "libini_config": [ { "arch": "x86_64", "epoch": null, "name": "libini_config", "release": "53.el9", "source": "rpm", "version": "1.3.1" } ], "libjcat": [ { "arch": "x86_64", "epoch": null, "name": "libjcat", "release": "3.el9", "source": "rpm", "version": "0.1.6" } ], "libjpeg-turbo": [ { "arch": "x86_64", "epoch": null, "name": "libjpeg-turbo", "release": "5.el9", "source": "rpm", "version": "2.0.90" } ], "libkcapi": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi", "release": "3.el9", "source": "rpm", "version": "1.3.1" } ], "libkcapi-hmaccalc": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi-hmaccalc", "release": "3.el9", "source": "rpm", "version": "1.3.1" } ], "libksba": [ { "arch": "x86_64", "epoch": null, "name": "libksba", "release": "4.el9", "source": "rpm", "version": "1.5.1" } ], "libldb": [ { "arch": "x86_64", "epoch": null, "name": "libldb", "release": "1.el9", "source": "rpm", "version": "2.5.0" } ], "libmaxminddb": [ { "arch": "x86_64", "epoch": null, "name": "libmaxminddb", "release": "3.el9", "source": "rpm", "version": "1.5.2" } ], "libmnl": [ { "arch": "x86_64", "epoch": null, "name": "libmnl", "release": "15.el9", "source": "rpm", "version": "1.0.4" } ], 
"libmodulemd": [ { "arch": "x86_64", "epoch": null, "name": "libmodulemd", "release": "2.el9", "source": "rpm", "version": "2.13.0" } ], "libmount": [ { "arch": "x86_64", "epoch": null, "name": "libmount", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libndp": [ { "arch": "x86_64", "epoch": null, "name": "libndp", "release": "4.el9", "source": "rpm", "version": "1.8" } ], "libnetfilter_conntrack": [ { "arch": "x86_64", "epoch": null, "name": "libnetfilter_conntrack", "release": "4.el9", "source": "rpm", "version": "1.0.8" } ], "libnfnetlink": [ { "arch": "x86_64", "epoch": null, "name": "libnfnetlink", "release": "21.el9", "source": "rpm", "version": "1.0.1" } ], "libnfsidmap": [ { "arch": "x86_64", "epoch": 1, "name": "libnfsidmap", "release": "10.el9", "source": "rpm", "version": "2.5.4" } ], "libnghttp2": [ { "arch": "x86_64", "epoch": null, "name": "libnghttp2", "release": "5.el9", "source": "rpm", "version": "1.43.0" } ], "libnl3": [ { "arch": "x86_64", "epoch": null, "name": "libnl3", "release": "2.el9", "source": "rpm", "version": "3.6.0" } ], "libnl3-cli": [ { "arch": "x86_64", "epoch": null, "name": "libnl3-cli", "release": "2.el9", "source": "rpm", "version": "3.6.0" } ], "libpath_utils": [ { "arch": "x86_64", "epoch": null, "name": "libpath_utils", "release": "53.el9", "source": "rpm", "version": "0.2.1" } ], "libpcap": [ { "arch": "x86_64", "epoch": 14, "name": "libpcap", "release": "4.el9", "source": "rpm", "version": "1.10.0" } ], "libpipeline": [ { "arch": "x86_64", "epoch": null, "name": "libpipeline", "release": "4.el9", "source": "rpm", "version": "1.5.3" } ], "libpng": [ { "arch": "x86_64", "epoch": 2, "name": "libpng", "release": "12.el9", "source": "rpm", "version": "1.6.37" } ], "libproxy": [ { "arch": "x86_64", "epoch": null, "name": "libproxy", "release": "35.el9", "source": "rpm", "version": "0.4.15" } ], "libproxy-webkitgtk4": [ { "arch": "x86_64", "epoch": null, "name": "libproxy-webkitgtk4", "release": "35.el9", 
"source": "rpm", "version": "0.4.15" } ], "libpsl": [ { "arch": "x86_64", "epoch": null, "name": "libpsl", "release": "5.el9", "source": "rpm", "version": "0.21.1" } ], "libpwquality": [ { "arch": "x86_64", "epoch": null, "name": "libpwquality", "release": "8.el9", "source": "rpm", "version": "1.4.4" } ], "libref_array": [ { "arch": "x86_64", "epoch": null, "name": "libref_array", "release": "53.el9", "source": "rpm", "version": "0.1.5" } ], "librepo": [ { "arch": "x86_64", "epoch": null, "name": "librepo", "release": "1.el9", "source": "rpm", "version": "1.14.2" } ], "libreport-filesystem": [ { "arch": "noarch", "epoch": null, "name": "libreport-filesystem", "release": "6.el9", "source": "rpm", "version": "2.15.2" } ], "librhsm": [ { "arch": "x86_64", "epoch": null, "name": "librhsm", "release": "7.el9", "source": "rpm", "version": "0.0.3" } ], "libseccomp": [ { "arch": "x86_64", "epoch": null, "name": "libseccomp", "release": "2.el9", "source": "rpm", "version": "2.5.2" } ], "libselinux": [ { "arch": "x86_64", "epoch": null, "name": "libselinux", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libselinux-utils": [ { "arch": "x86_64", "epoch": null, "name": "libselinux-utils", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libsemanage": [ { "arch": "x86_64", "epoch": null, "name": "libsemanage", "release": "3.el9", "source": "rpm", "version": "3.3" } ], "libsepol": [ { "arch": "x86_64", "epoch": null, "name": "libsepol", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "libsigsegv": [ { "arch": "x86_64", "epoch": null, "name": "libsigsegv", "release": "4.el9", "source": "rpm", "version": "2.13" } ], "libsmartcols": [ { "arch": "x86_64", "epoch": null, "name": "libsmartcols", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libsmbios": [ { "arch": "x86_64", "epoch": null, "name": "libsmbios", "release": "4.el9", "source": "rpm", "version": "2.4.3" } ], "libsolv": [ { "arch": "x86_64", "epoch": null, "name": 
"libsolv", "release": "1.el9", "source": "rpm", "version": "0.7.22" } ], "libsoup": [ { "arch": "x86_64", "epoch": null, "name": "libsoup", "release": "8.el9", "source": "rpm", "version": "2.72.0" } ], "libss": [ { "arch": "x86_64", "epoch": null, "name": "libss", "release": "2.el9", "source": "rpm", "version": "1.46.5" } ], "libssh": [ { "arch": "x86_64", "epoch": null, "name": "libssh", "release": "3.el9", "source": "rpm", "version": "0.9.6" } ], "libssh-config": [ { "arch": "noarch", "epoch": null, "name": "libssh-config", "release": "3.el9", "source": "rpm", "version": "0.9.6" } ], "libsss_certmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_certmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_nss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_nss_idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libsss_sudo": [ { "arch": "x86_64", "epoch": null, "name": "libsss_sudo", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "libstdc++": [ { "arch": "x86_64", "epoch": null, "name": "libstdc++", "release": "2.el9", "source": "rpm", "version": "11.3.1" } ], "libstemmer": [ { "arch": "x86_64", "epoch": null, "name": "libstemmer", "release": "18.585svn.el9", "source": "rpm", "version": "0" } ], "libsysfs": [ { "arch": "x86_64", "epoch": null, "name": "libsysfs", "release": "10.el9", "source": "rpm", "version": "2.1.1" } ], "libtalloc": [ { "arch": "x86_64", "epoch": null, "name": "libtalloc", "release": "1.el9", "source": "rpm", "version": "2.3.3" } ], "libtasn1": [ { "arch": "x86_64", "epoch": null, "name": "libtasn1", "release": "7.el9", "source": "rpm", "version": "4.16.0" } ], "libtdb": [ { "arch": "x86_64", "epoch": null, "name": "libtdb", "release": "1.el9", "source": "rpm", "version": "1.4.6" } ], "libteam": [ { "arch": "x86_64", "epoch": null, 
"name": "libteam", "release": "11.el9", "source": "rpm", "version": "1.31" } ], "libtevent": [ { "arch": "x86_64", "epoch": null, "name": "libtevent", "release": "0.el9", "source": "rpm", "version": "0.12.0" } ], "libtirpc": [ { "arch": "x86_64", "epoch": null, "name": "libtirpc", "release": "1.el9", "source": "rpm", "version": "1.3.2" } ], "libunistring": [ { "arch": "x86_64", "epoch": null, "name": "libunistring", "release": "15.el9", "source": "rpm", "version": "0.9.10" } ], "libusbx": [ { "arch": "x86_64", "epoch": null, "name": "libusbx", "release": "1.el9", "source": "rpm", "version": "1.0.26" } ], "libuser": [ { "arch": "x86_64", "epoch": null, "name": "libuser", "release": "10.el9", "source": "rpm", "version": "0.63" } ], "libutempter": [ { "arch": "x86_64", "epoch": null, "name": "libutempter", "release": "6.el9", "source": "rpm", "version": "1.2.1" } ], "libuuid": [ { "arch": "x86_64", "epoch": null, "name": "libuuid", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "libverto": [ { "arch": "x86_64", "epoch": null, "name": "libverto", "release": "3.el9", "source": "rpm", "version": "0.3.2" } ], "libverto-libev": [ { "arch": "x86_64", "epoch": null, "name": "libverto-libev", "release": "3.el9", "source": "rpm", "version": "0.3.2" } ], "libxcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt", "release": "3.el9", "source": "rpm", "version": "4.4.18" } ], "libxcrypt-compat": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt-compat", "release": "3.el9", "source": "rpm", "version": "4.4.18" } ], "libxml2": [ { "arch": "x86_64", "epoch": null, "name": "libxml2", "release": "2.el9", "source": "rpm", "version": "2.9.13" } ], "libxmlb": [ { "arch": "x86_64", "epoch": null, "name": "libxmlb", "release": "1.el9", "source": "rpm", "version": "0.3.3" } ], "libyaml": [ { "arch": "x86_64", "epoch": null, "name": "libyaml", "release": "7.el9", "source": "rpm", "version": "0.2.5" } ], "libzstd": [ { "arch": "x86_64", "epoch": null, "name": 
"libzstd", "release": "2.el9", "source": "rpm", "version": "1.5.1" } ], "linux-firmware": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware", "release": "126.el9", "source": "rpm", "version": "20220509" } ], "linux-firmware-whence": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware-whence", "release": "126.el9", "source": "rpm", "version": "20220509" } ], "lmdb-libs": [ { "arch": "x86_64", "epoch": null, "name": "lmdb-libs", "release": "3.el9", "source": "rpm", "version": "0.9.29" } ], "logrotate": [ { "arch": "x86_64", "epoch": null, "name": "logrotate", "release": "5.el9", "source": "rpm", "version": "3.18.0" } ], "lshw": [ { "arch": "x86_64", "epoch": null, "name": "lshw", "release": "7.el9", "source": "rpm", "version": "B.02.19.2" } ], "lsof": [ { "arch": "x86_64", "epoch": null, "name": "lsof", "release": "3.el9", "source": "rpm", "version": "4.94.0" } ], "lsscsi": [ { "arch": "x86_64", "epoch": null, "name": "lsscsi", "release": "6.el9", "source": "rpm", "version": "0.32" } ], "lua-libs": [ { "arch": "x86_64", "epoch": null, "name": "lua-libs", "release": "4.el9", "source": "rpm", "version": "5.4.2" } ], "lvm2": [ { "arch": "x86_64", "epoch": 9, "name": "lvm2", "release": "4.el9", "source": "rpm", "version": "2.03.14" } ], "lvm2-libs": [ { "arch": "x86_64", "epoch": 9, "name": "lvm2-libs", "release": "4.el9", "source": "rpm", "version": "2.03.14" } ], "lz4-libs": [ { "arch": "x86_64", "epoch": null, "name": "lz4-libs", "release": "5.el9", "source": "rpm", "version": "1.9.3" } ], "lzo": [ { "arch": "x86_64", "epoch": null, "name": "lzo", "release": "7.el9", "source": "rpm", "version": "2.10" } ], "man-db": [ { "arch": "x86_64", "epoch": null, "name": "man-db", "release": "6.el9", "source": "rpm", "version": "2.9.3" } ], "mdadm": [ { "arch": "x86_64", "epoch": null, "name": "mdadm", "release": "2.el9", "source": "rpm", "version": "4.2" } ], "microcode_ctl": [ { "arch": "noarch", "epoch": 4, "name": "microcode_ctl", "release": "1.el9", 
"source": "rpm", "version": "20220207" } ], "mokutil": [ { "arch": "x86_64", "epoch": 2, "name": "mokutil", "release": "9.el9", "source": "rpm", "version": "0.4.0" } ], "mpfr": [ { "arch": "x86_64", "epoch": null, "name": "mpfr", "release": "7.el9", "source": "rpm", "version": "4.1.0" } ], "ncurses": [ { "arch": "x86_64", "epoch": null, "name": "ncurses", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ncurses-base": [ { "arch": "noarch", "epoch": null, "name": "ncurses-base", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ncurses-libs": [ { "arch": "x86_64", "epoch": null, "name": "ncurses-libs", "release": "8.20210508.el9", "source": "rpm", "version": "6.2" } ], "ndctl": [ { "arch": "x86_64", "epoch": null, "name": "ndctl", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "ndctl-libs": [ { "arch": "x86_64", "epoch": null, "name": "ndctl-libs", "release": "6.el9", "source": "rpm", "version": "71.1" } ], "nettle": [ { "arch": "x86_64", "epoch": null, "name": "nettle", "release": "2.el9", "source": "rpm", "version": "3.7.3" } ], "newt": [ { "arch": "x86_64", "epoch": null, "name": "newt", "release": "11.el9", "source": "rpm", "version": "0.52.21" } ], "nfs-utils": [ { "arch": "x86_64", "epoch": 1, "name": "nfs-utils", "release": "10.el9", "source": "rpm", "version": "2.5.4" } ], "npth": [ { "arch": "x86_64", "epoch": null, "name": "npth", "release": "8.el9", "source": "rpm", "version": "1.6" } ], "nspr": [ { "arch": "x86_64", "epoch": null, "name": "nspr", "release": "9.el9", "source": "rpm", "version": "4.32.0" } ], "nss": [ { "arch": "x86_64", "epoch": null, "name": "nss", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-softokn": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-softokn-freebl": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn-freebl", "release": "7.el9", "source": "rpm", "version": "3.71.0" 
} ], "nss-sysinit": [ { "arch": "x86_64", "epoch": null, "name": "nss-sysinit", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "nss-util": [ { "arch": "x86_64", "epoch": null, "name": "nss-util", "release": "7.el9", "source": "rpm", "version": "3.71.0" } ], "numactl-libs": [ { "arch": "x86_64", "epoch": null, "name": "numactl-libs", "release": "8.el9", "source": "rpm", "version": "2.0.14" } ], "oddjob": [ { "arch": "x86_64", "epoch": null, "name": "oddjob", "release": "5.el9", "source": "rpm", "version": "0.34.7" } ], "oddjob-mkhomedir": [ { "arch": "x86_64", "epoch": null, "name": "oddjob-mkhomedir", "release": "5.el9", "source": "rpm", "version": "0.34.7" } ], "openldap": [ { "arch": "x86_64", "epoch": null, "name": "openldap", "release": "5.el9", "source": "rpm", "version": "2.4.59" } ], "openssh": [ { "arch": "x86_64", "epoch": null, "name": "openssh", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssh-clients": [ { "arch": "x86_64", "epoch": null, "name": "openssh-clients", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssh-server": [ { "arch": "x86_64", "epoch": null, "name": "openssh-server", "release": "8.el9", "source": "rpm", "version": "8.7p1" } ], "openssl": [ { "arch": "x86_64", "epoch": 1, "name": "openssl", "release": "33.el9_0", "source": "rpm", "version": "3.0.1" } ], "openssl-libs": [ { "arch": "x86_64", "epoch": 1, "name": "openssl-libs", "release": "33.el9_0", "source": "rpm", "version": "3.0.1" } ], "openssl-pkcs11": [ { "arch": "x86_64", "epoch": null, "name": "openssl-pkcs11", "release": "7.el9", "source": "rpm", "version": "0.4.11" } ], "os-prober": [ { "arch": "x86_64", "epoch": null, "name": "os-prober", "release": "9.el9", "source": "rpm", "version": "1.77" } ], "p11-kit": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit", "release": "2.el9", "source": "rpm", "version": "0.24.1" } ], "p11-kit-trust": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit-trust", "release": "2.el9", 
"source": "rpm", "version": "0.24.1" } ], "pam": [ { "arch": "x86_64", "epoch": null, "name": "pam", "release": "11.el9", "source": "rpm", "version": "1.5.1" } ], "parted": [ { "arch": "x86_64", "epoch": null, "name": "parted", "release": "6.el9", "source": "rpm", "version": "3.4" } ], "passwd": [ { "arch": "x86_64", "epoch": null, "name": "passwd", "release": "12.el9", "source": "rpm", "version": "0.80" } ], "pciutils": [ { "arch": "x86_64", "epoch": null, "name": "pciutils", "release": "5.el9", "source": "rpm", "version": "3.7.0" } ], "pciutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "pciutils-libs", "release": "5.el9", "source": "rpm", "version": "3.7.0" } ], "pcre": [ { "arch": "x86_64", "epoch": null, "name": "pcre", "release": "3.el9.3", "source": "rpm", "version": "8.44" } ], "pcre2": [ { "arch": "x86_64", "epoch": null, "name": "pcre2", "release": "2.el9", "source": "rpm", "version": "10.40" } ], "pcre2-syntax": [ { "arch": "noarch", "epoch": null, "name": "pcre2-syntax", "release": "2.el9", "source": "rpm", "version": "10.40" } ], "pigz": [ { "arch": "x86_64", "epoch": null, "name": "pigz", "release": "4.el9", "source": "rpm", "version": "2.5" } ], "policycoreutils": [ { "arch": "x86_64", "epoch": null, "name": "policycoreutils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "policycoreutils-python-utils": [ { "arch": "noarch", "epoch": null, "name": "policycoreutils-python-utils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "polkit": [ { "arch": "x86_64", "epoch": null, "name": "polkit", "release": "10.el9_0", "source": "rpm", "version": "0.117" } ], "polkit-libs": [ { "arch": "x86_64", "epoch": null, "name": "polkit-libs", "release": "10.el9_0", "source": "rpm", "version": "0.117" } ], "polkit-pkla-compat": [ { "arch": "x86_64", "epoch": null, "name": "polkit-pkla-compat", "release": "21.el9", "source": "rpm", "version": "0.1" } ], "popt": [ { "arch": "x86_64", "epoch": null, "name": "popt", "release": "8.el9", 
"source": "rpm", "version": "1.18" } ], "prefixdevname": [ { "arch": "x86_64", "epoch": null, "name": "prefixdevname", "release": "8.el9", "source": "rpm", "version": "0.1.0" } ], "procps-ng": [ { "arch": "x86_64", "epoch": null, "name": "procps-ng", "release": "5.el9", "source": "rpm", "version": "3.3.17" } ], "protobuf-c": [ { "arch": "x86_64", "epoch": null, "name": "protobuf-c", "release": "12.el9", "source": "rpm", "version": "1.3.3" } ], "psmisc": [ { "arch": "x86_64", "epoch": null, "name": "psmisc", "release": "3.el9", "source": "rpm", "version": "23.4" } ], "publicsuffix-list-dafsa": [ { "arch": "noarch", "epoch": null, "name": "publicsuffix-list-dafsa", "release": "3.el9", "source": "rpm", "version": "20210518" } ], "python-unversioned-command": [ { "arch": "noarch", "epoch": null, "name": "python-unversioned-command", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3": [ { "arch": "x86_64", "epoch": null, "name": "python3", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3-attrs": [ { "arch": "noarch", "epoch": null, "name": "python3-attrs", "release": "7.el9", "source": "rpm", "version": "20.3.0" } ], "python3-audit": [ { "arch": "x86_64", "epoch": null, "name": "python3-audit", "release": "102.el9", "source": "rpm", "version": "3.0.7" } ], "python3-babel": [ { "arch": "noarch", "epoch": null, "name": "python3-babel", "release": "2.el9", "source": "rpm", "version": "2.9.1" } ], "python3-blivet": [ { "arch": "noarch", "epoch": 1, "name": "python3-blivet", "release": "13.el9_0", "source": "rpm", "version": "3.4.0" } ], "python3-blockdev": [ { "arch": "x86_64", "epoch": null, "name": "python3-blockdev", "release": "12.el9", "source": "rpm", "version": "2.25" } ], "python3-bytesize": [ { "arch": "x86_64", "epoch": null, "name": "python3-bytesize", "release": "3.el9", "source": "rpm", "version": "2.5" } ], "python3-chardet": [ { "arch": "noarch", "epoch": null, "name": "python3-chardet", "release": "5.el9", "source": 
"rpm", "version": "4.0.0" } ], "python3-cloud-what": [ { "arch": "x86_64", "epoch": null, "name": "python3-cloud-what", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "python3-configobj": [ { "arch": "noarch", "epoch": null, "name": "python3-configobj", "release": "25.el9", "source": "rpm", "version": "5.0.6" } ], "python3-dasbus": [ { "arch": "noarch", "epoch": null, "name": "python3-dasbus", "release": "5.el9", "source": "rpm", "version": "1.4" } ], "python3-dateutil": [ { "arch": "noarch", "epoch": 1, "name": "python3-dateutil", "release": "6.el9", "source": "rpm", "version": "2.8.1" } ], "python3-dbus": [ { "arch": "x86_64", "epoch": null, "name": "python3-dbus", "release": "2.el9", "source": "rpm", "version": "1.2.18" } ], "python3-decorator": [ { "arch": "noarch", "epoch": null, "name": "python3-decorator", "release": "6.el9", "source": "rpm", "version": "4.4.2" } ], "python3-distro": [ { "arch": "noarch", "epoch": null, "name": "python3-distro", "release": "7.el9", "source": "rpm", "version": "1.5.0" } ], "python3-dmidecode": [ { "arch": "x86_64", "epoch": null, "name": "python3-dmidecode", "release": "27.el9", "source": "rpm", "version": "3.12.2" } ], "python3-dnf": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "python3-dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf-plugins-core", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "python3-ethtool": [ { "arch": "x86_64", "epoch": null, "name": "python3-ethtool", "release": "2.el9", "source": "rpm", "version": "0.15" } ], "python3-file-magic": [ { "arch": "noarch", "epoch": null, "name": "python3-file-magic", "release": "8.el9", "source": "rpm", "version": "5.39" } ], "python3-gobject-base": [ { "arch": "x86_64", "epoch": null, "name": "python3-gobject-base", "release": "5.el9", "source": "rpm", "version": "3.40.1" } ], "python3-gpg": [ { "arch": "x86_64", "epoch": null, 
"name": "python3-gpg", "release": "6.el9", "source": "rpm", "version": "1.15.1" } ], "python3-hawkey": [ { "arch": "x86_64", "epoch": null, "name": "python3-hawkey", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "python3-idna": [ { "arch": "noarch", "epoch": null, "name": "python3-idna", "release": "7.el9", "source": "rpm", "version": "2.10" } ], "python3-iniparse": [ { "arch": "noarch", "epoch": null, "name": "python3-iniparse", "release": "45.el9", "source": "rpm", "version": "0.4" } ], "python3-inotify": [ { "arch": "noarch", "epoch": null, "name": "python3-inotify", "release": "25.el9", "source": "rpm", "version": "0.9.6" } ], "python3-jinja2": [ { "arch": "noarch", "epoch": null, "name": "python3-jinja2", "release": "4.el9", "source": "rpm", "version": "2.11.3" } ], "python3-jsonpatch": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonpatch", "release": "16.el9", "source": "rpm", "version": "1.21" } ], "python3-jsonpointer": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonpointer", "release": "4.el9", "source": "rpm", "version": "2.0" } ], "python3-jsonschema": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonschema", "release": "13.el9", "source": "rpm", "version": "3.2.0" } ], "python3-libcomps": [ { "arch": "x86_64", "epoch": null, "name": "python3-libcomps", "release": "1.el9", "source": "rpm", "version": "0.1.18" } ], "python3-libdnf": [ { "arch": "x86_64", "epoch": null, "name": "python3-libdnf", "release": "1.el9", "source": "rpm", "version": "0.67.0" } ], "python3-librepo": [ { "arch": "x86_64", "epoch": null, "name": "python3-librepo", "release": "1.el9", "source": "rpm", "version": "1.14.2" } ], "python3-libs": [ { "arch": "x86_64", "epoch": null, "name": "python3-libs", "release": "2.el9", "source": "rpm", "version": "3.9.10" } ], "python3-libselinux": [ { "arch": "x86_64", "epoch": null, "name": "python3-libselinux", "release": "2.el9", "source": "rpm", "version": "3.3" } ], "python3-libsemanage": [ { 
"arch": "x86_64", "epoch": null, "name": "python3-libsemanage", "release": "3.el9", "source": "rpm", "version": "3.3" } ], "python3-libxml2": [ { "arch": "x86_64", "epoch": null, "name": "python3-libxml2", "release": "2.el9", "source": "rpm", "version": "2.9.13" } ], "python3-linux-procfs": [ { "arch": "noarch", "epoch": null, "name": "python3-linux-procfs", "release": "1.el9", "source": "rpm", "version": "0.7.0" } ], "python3-markupsafe": [ { "arch": "x86_64", "epoch": null, "name": "python3-markupsafe", "release": "12.el9", "source": "rpm", "version": "1.1.1" } ], "python3-netifaces": [ { "arch": "x86_64", "epoch": null, "name": "python3-netifaces", "release": "15.el9", "source": "rpm", "version": "0.10.6" } ], "python3-oauthlib": [ { "arch": "noarch", "epoch": null, "name": "python3-oauthlib", "release": "2.el9", "source": "rpm", "version": "3.1.1" } ], "python3-perf": [ { "arch": "x86_64", "epoch": null, "name": "python3-perf", "release": "101.el9", "source": "rpm", "version": "5.14.0" } ], "python3-pexpect": [ { "arch": "noarch", "epoch": null, "name": "python3-pexpect", "release": "7.el9", "source": "rpm", "version": "4.8.0" } ], "python3-pip-wheel": [ { "arch": "noarch", "epoch": null, "name": "python3-pip-wheel", "release": "6.el9", "source": "rpm", "version": "21.2.3" } ], "python3-policycoreutils": [ { "arch": "noarch", "epoch": null, "name": "python3-policycoreutils", "release": "6.el9_0", "source": "rpm", "version": "3.3" } ], "python3-prettytable": [ { "arch": "noarch", "epoch": null, "name": "python3-prettytable", "release": "27.el9", "source": "rpm", "version": "0.7.2" } ], "python3-ptyprocess": [ { "arch": "noarch", "epoch": null, "name": "python3-ptyprocess", "release": "12.el9", "source": "rpm", "version": "0.6.0" } ], "python3-pyparted": [ { "arch": "x86_64", "epoch": 1, "name": "python3-pyparted", "release": "4.el9", "source": "rpm", "version": "3.11.7" } ], "python3-pyrsistent": [ { "arch": "x86_64", "epoch": null, "name": "python3-pyrsistent", 
"release": "8.el9", "source": "rpm", "version": "0.17.3" } ], "python3-pyserial": [ { "arch": "noarch", "epoch": null, "name": "python3-pyserial", "release": "12.el9", "source": "rpm", "version": "3.4" } ], "python3-pysocks": [ { "arch": "noarch", "epoch": null, "name": "python3-pysocks", "release": "12.el9", "source": "rpm", "version": "1.7.1" } ], "python3-pytz": [ { "arch": "noarch", "epoch": null, "name": "python3-pytz", "release": "4.el9", "source": "rpm", "version": "2021.1" } ], "python3-pyudev": [ { "arch": "noarch", "epoch": null, "name": "python3-pyudev", "release": "6.el9", "source": "rpm", "version": "0.22.0" } ], "python3-pyyaml": [ { "arch": "x86_64", "epoch": null, "name": "python3-pyyaml", "release": "6.el9", "source": "rpm", "version": "5.4.1" } ], "python3-requests": [ { "arch": "noarch", "epoch": null, "name": "python3-requests", "release": "6.el9", "source": "rpm", "version": "2.25.1" } ], "python3-rpm": [ { "arch": "x86_64", "epoch": null, "name": "python3-rpm", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "python3-setools": [ { "arch": "x86_64", "epoch": null, "name": "python3-setools", "release": "4.el9", "source": "rpm", "version": "4.4.0" } ], "python3-setuptools": [ { "arch": "noarch", "epoch": null, "name": "python3-setuptools", "release": "10.el9", "source": "rpm", "version": "53.0.0" } ], "python3-setuptools-wheel": [ { "arch": "noarch", "epoch": null, "name": "python3-setuptools-wheel", "release": "10.el9", "source": "rpm", "version": "53.0.0" } ], "python3-six": [ { "arch": "noarch", "epoch": null, "name": "python3-six", "release": "9.el9", "source": "rpm", "version": "1.15.0" } ], "python3-subscription-manager-rhsm": [ { "arch": "x86_64", "epoch": null, "name": "python3-subscription-manager-rhsm", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "python3-systemd": [ { "arch": "x86_64", "epoch": null, "name": "python3-systemd", "release": "18.el9", "source": "rpm", "version": "234" } ], 
"python3-urllib3": [ { "arch": "noarch", "epoch": null, "name": "python3-urllib3", "release": "3.el9", "source": "rpm", "version": "1.26.5" } ], "qemu-guest-agent": [ { "arch": "x86_64", "epoch": 17, "name": "qemu-guest-agent", "release": "4.el9", "source": "rpm", "version": "7.0.0" } ], "quota": [ { "arch": "x86_64", "epoch": 1, "name": "quota", "release": "6.el9", "source": "rpm", "version": "4.06" } ], "quota-nls": [ { "arch": "noarch", "epoch": 1, "name": "quota-nls", "release": "6.el9", "source": "rpm", "version": "4.06" } ], "readline": [ { "arch": "x86_64", "epoch": null, "name": "readline", "release": "4.el9", "source": "rpm", "version": "8.1" } ], "redhat-logos": [ { "arch": "x86_64", "epoch": null, "name": "redhat-logos", "release": "1.el9", "source": "rpm", "version": "90.4" } ], "redhat-release": [ { "arch": "x86_64", "epoch": null, "name": "redhat-release", "release": "1.3.el9", "source": "rpm", "version": "9.1" } ], "redhat-release-eula": [ { "arch": "x86_64", "epoch": null, "name": "redhat-release-eula", "release": "1.3.el9", "source": "rpm", "version": "9.1" } ], "rhsm-icons": [ { "arch": "noarch", "epoch": null, "name": "rhsm-icons", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "rootfiles": [ { "arch": "noarch", "epoch": null, "name": "rootfiles", "release": "31.el9", "source": "rpm", "version": "8.1" } ], "rpcbind": [ { "arch": "x86_64", "epoch": null, "name": "rpcbind", "release": "2.el9", "source": "rpm", "version": "1.2.6" } ], "rpm": [ { "arch": "x86_64", "epoch": null, "name": "rpm", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-build-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-build-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-audit": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-audit", "release": 
"12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-selinux": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-selinux", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-plugin-systemd-inhibit": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-systemd-inhibit", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rpm-sign-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-sign-libs", "release": "12.el9_0", "source": "rpm", "version": "4.16.1.3" } ], "rsync": [ { "arch": "x86_64", "epoch": null, "name": "rsync", "release": "11.el9", "source": "rpm", "version": "3.2.3" } ], "rsyslog": [ { "arch": "x86_64", "epoch": null, "name": "rsyslog", "release": "105.el9", "source": "rpm", "version": "8.2102.0" } ], "rsyslog-logrotate": [ { "arch": "x86_64", "epoch": null, "name": "rsyslog-logrotate", "release": "105.el9", "source": "rpm", "version": "8.2102.0" } ], "sed": [ { "arch": "x86_64", "epoch": null, "name": "sed", "release": "9.el9", "source": "rpm", "version": "4.8" } ], "selinux-policy": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy", "release": "1.el9", "source": "rpm", "version": "34.1.33" } ], "selinux-policy-targeted": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy-targeted", "release": "1.el9", "source": "rpm", "version": "34.1.33" } ], "setroubleshoot-plugins": [ { "arch": "noarch", "epoch": null, "name": "setroubleshoot-plugins", "release": "4.el9", "source": "rpm", "version": "3.3.14" } ], "setroubleshoot-server": [ { "arch": "x86_64", "epoch": null, "name": "setroubleshoot-server", "release": "3.el9_0", "source": "rpm", "version": "3.3.28" } ], "setup": [ { "arch": "noarch", "epoch": null, "name": "setup", "release": "6.el9", "source": "rpm", "version": "2.13.7" } ], "sg3_utils": [ { "arch": "x86_64", "epoch": null, "name": "sg3_utils", "release": "8.el9", "source": "rpm", "version": "1.47" } ], "sg3_utils-libs": [ { "arch": "x86_64", "epoch": null, 
"name": "sg3_utils-libs", "release": "8.el9", "source": "rpm", "version": "1.47" } ], "shadow-utils": [ { "arch": "x86_64", "epoch": 2, "name": "shadow-utils", "release": "4.el9", "source": "rpm", "version": "4.9" } ], "shared-mime-info": [ { "arch": "x86_64", "epoch": null, "name": "shared-mime-info", "release": "4.el9", "source": "rpm", "version": "2.1" } ], "shim-x64": [ { "arch": "x86_64", "epoch": null, "name": "shim-x64", "release": "2.el9", "source": "rpm", "version": "15.5" } ], "slang": [ { "arch": "x86_64", "epoch": null, "name": "slang", "release": "11.el9", "source": "rpm", "version": "2.3.2" } ], "snappy": [ { "arch": "x86_64", "epoch": null, "name": "snappy", "release": "8.el9", "source": "rpm", "version": "1.1.8" } ], "sos": [ { "arch": "noarch", "epoch": null, "name": "sos", "release": "1.el9", "source": "rpm", "version": "4.3" } ], "sqlite-libs": [ { "arch": "x86_64", "epoch": null, "name": "sqlite-libs", "release": "5.el9", "source": "rpm", "version": "3.34.1" } ], "squashfs-tools": [ { "arch": "x86_64", "epoch": null, "name": "squashfs-tools", "release": "8.git1.el9", "source": "rpm", "version": "4.4" } ], "sscg": [ { "arch": "x86_64", "epoch": null, "name": "sscg", "release": "5.el9", "source": "rpm", "version": "3.0.0" } ], "sssd-client": [ { "arch": "x86_64", "epoch": null, "name": "sssd-client", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-common": [ { "arch": "x86_64", "epoch": null, "name": "sssd-common", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-kcm": [ { "arch": "x86_64", "epoch": null, "name": "sssd-kcm", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "sssd-nfs-idmap": [ { "arch": "x86_64", "epoch": null, "name": "sssd-nfs-idmap", "release": "2.el9", "source": "rpm", "version": "2.7.0" } ], "subscription-manager": [ { "arch": "x86_64", "epoch": null, "name": "subscription-manager", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "subscription-manager-cockpit": 
[ { "arch": "noarch", "epoch": null, "name": "subscription-manager-cockpit", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "subscription-manager-rhsm-certificates": [ { "arch": "x86_64", "epoch": null, "name": "subscription-manager-rhsm-certificates", "release": "3.el9_0", "source": "rpm", "version": "1.29.26" } ], "sudo": [ { "arch": "x86_64", "epoch": null, "name": "sudo", "release": "7.el9", "source": "rpm", "version": "1.9.5p2" } ], "systemd": [ { "arch": "x86_64", "epoch": null, "name": "systemd", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-libs": [ { "arch": "x86_64", "epoch": null, "name": "systemd-libs", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-pam": [ { "arch": "x86_64", "epoch": null, "name": "systemd-pam", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-rpm-macros": [ { "arch": "noarch", "epoch": null, "name": "systemd-rpm-macros", "release": "7.el9", "source": "rpm", "version": "250" } ], "systemd-udev": [ { "arch": "x86_64", "epoch": null, "name": "systemd-udev", "release": "7.el9", "source": "rpm", "version": "250" } ], "tar": [ { "arch": "x86_64", "epoch": 2, "name": "tar", "release": "3.el9", "source": "rpm", "version": "1.34" } ], "tcpdump": [ { "arch": "x86_64", "epoch": 14, "name": "tcpdump", "release": "6.el9", "source": "rpm", "version": "4.99.0" } ], "teamd": [ { "arch": "x86_64", "epoch": null, "name": "teamd", "release": "11.el9", "source": "rpm", "version": "1.31" } ], "tpm2-tss": [ { "arch": "x86_64", "epoch": null, "name": "tpm2-tss", "release": "7.el9", "source": "rpm", "version": "3.0.3" } ], "tuned": [ { "arch": "noarch", "epoch": null, "name": "tuned", "release": "2.el9", "source": "rpm", "version": "2.18.0" } ], "tzdata": [ { "arch": "noarch", "epoch": null, "name": "tzdata", "release": "1.el9", "source": "rpm", "version": "2022a" } ], "usermode": [ { "arch": "x86_64", "epoch": null, "name": "usermode", "release": "4.el9", "source": "rpm", 
"version": "1.114" } ], "userspace-rcu": [ { "arch": "x86_64", "epoch": null, "name": "userspace-rcu", "release": "6.el9", "source": "rpm", "version": "0.12.1" } ], "util-linux": [ { "arch": "x86_64", "epoch": null, "name": "util-linux", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "util-linux-core": [ { "arch": "x86_64", "epoch": null, "name": "util-linux-core", "release": "3.el9", "source": "rpm", "version": "2.37.4" } ], "vdo": [ { "arch": "x86_64", "epoch": null, "name": "vdo", "release": "1.el9", "source": "rpm", "version": "8.1.1.360" } ], "vim-minimal": [ { "arch": "x86_64", "epoch": 2, "name": "vim-minimal", "release": "16.el9_0.2", "source": "rpm", "version": "8.2.2637" } ], "virt-what": [ { "arch": "x86_64", "epoch": null, "name": "virt-what", "release": "1.el9", "source": "rpm", "version": "1.22" } ], "volume_key-libs": [ { "arch": "x86_64", "epoch": null, "name": "volume_key-libs", "release": "15.el9", "source": "rpm", "version": "0.3.12" } ], "webkit2gtk3-jsc": [ { "arch": "x86_64", "epoch": null, "name": "webkit2gtk3-jsc", "release": "1.el9", "source": "rpm", "version": "2.36.1" } ], "which": [ { "arch": "x86_64", "epoch": null, "name": "which", "release": "28.el9", "source": "rpm", "version": "2.21" } ], "xfsprogs": [ { "arch": "x86_64", "epoch": null, "name": "xfsprogs", "release": "1.el9", "source": "rpm", "version": "5.14.2" } ], "xz": [ { "arch": "x86_64", "epoch": null, "name": "xz", "release": "7.el9", "source": "rpm", "version": "5.2.5" } ], "xz-libs": [ { "arch": "x86_64", "epoch": null, "name": "xz-libs", "release": "7.el9", "source": "rpm", "version": "5.2.5" } ], "yum": [ { "arch": "noarch", "epoch": null, "name": "yum", "release": "2.el9", "source": "rpm", "version": "4.12.0" } ], "yum-utils": [ { "arch": "noarch", "epoch": null, "name": "yum-utils", "release": "1.el9", "source": "rpm", "version": "4.1.0" } ], "zlib": [ { "arch": "x86_64", "epoch": null, "name": "zlib", "release": "33.el9", "source": "rpm", "version": 
"1.2.11" } ], "zstd": [ { "arch": "x86_64", "epoch": null, "name": "zstd", "release": "2.el9", "source": "rpm", "version": "1.5.1" } ] } }, "changed": false } TASK [Set blivet package name] ************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:29 Wednesday 01 June 2022 16:39:01 +0000 (0:00:01.198) 0:00:12.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "blivet_pkg_name": [ "python3-blivet" ] }, "changed": false } TASK [Set blivet package version] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:33 Wednesday 01 June 2022 16:39:01 +0000 (0:00:00.085) 0:00:12.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "blivet_pkg_version": "3.4.0-13.el9_0" }, "changed": false } TASK [Check if kvdo is loadable] *********************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:37 Wednesday 01 June 2022 16:39:01 +0000 (0:00:00.082) 0:00:12.327 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false, "cmd": "set -euo pipefail\nmodprobe --dry-run kvdo\n", "delta": "0:00:00.005136", "end": "2022-06-01 12:39:01.594866", "rc": 1, "start": "2022-06-01 12:39:01.589730" } STDERR: modprobe: FATAL: Module kvdo not found in directory /lib/modules/5.14.0-101.el9.x86_64 MSG: non-zero return code ...ignoring TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:46 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.526) 0:00:12.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create LVM VDO volume under volume group 'pool1'] ************************ task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:51 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.082) 0:00:12.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:66 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.079) 0:00:13.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:68 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.079) 0:00:13.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:83 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.080) 0:00:13.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove LVM VDO 
volume in 'pool1' created above] ************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:85 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.086) 0:00:13.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:101 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.081) 0:00:13.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create LVM VDO volume under volume group 'pool1' (this time default size)] *** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:103 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.082) 0:00:13.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:117 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.078) 0:00:13.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove LVM VDO volume in 'pool1' created above] ************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:119 Wednesday 01 June 2022 16:39:02 +0000 (0:00:00.094) 0:00:13.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:134 Wednesday 01 June 2022 16:39:03 +0000 (0:00:00.094) 0:00:13.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } META: ran handlers META: ran 
handlers
PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=29   changed=0    unreachable=0    failed=0    skipped=23   rescued=0    ignored=1

Wednesday 01 June 2022  16:39:03 +0000 (0:00:00.036)       0:00:13.732 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gather package facts ---------------------------------------------------- 1.20s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:25 -----------------
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_scsi_generated.yml:3 ---
linux-system-roles.storage : make sure blivet is available -------------- 0.99s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.79s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:2 ------------------
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Check if kvdo is loadable ----------------------------------------------- 0.53s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:37 -----------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.45s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.17s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
Remove LVM VDO volume in 'pool1' created above -------------------------- 0.09s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:119 ----------------
include_tasks ----------------------------------------------------------- 0.09s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:117 ----------------
include_tasks ----------------------------------------------------------- 0.09s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:83 -----------------
Set blivet package name ------------------------------------------------- 0.09s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:29 -----------------
Set blivet package version ---------------------------------------------- 0.08s
/tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml:33 -----------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:39:03 +0000 (0:00:00.022)       0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden
due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:39:05 +0000 (0:00:01.242)       0:00:01.265 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_partition_volume_then_remove.yml ************************
1 plays in /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:2
Wednesday 01 June 2022  16:39:05 +0000 (0:00:00.014)       0:00:01.280 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:9
Wednesday 01 June 2022  16:39:06 +0000 (0:00:01.055)       0:00:02.336 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.039)       0:00:02.375 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.147)       0:00:02.522 ********
ok: [/cache/rhel-x.qcow2]

TASK
[linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.521)       0:00:03.044 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.073)       0:00:03.117 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.022)       0:00:03.140 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:39:06 +0000 (0:00:00.022)       0:00:03.162 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:39:07 +0000 (0:00:00.193)       0:00:03.356 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:39:07 +0000 (0:00:00.019)       0:00:03.375 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:39:08 +0000 (0:00:01.054)       0:00:04.430 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:39:08 +0000 (0:00:00.047)       0:00:04.478 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:39:08 +0000 (0:00:00.043)       0:00:04.521 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.689)       0:00:05.211 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.079)       0:00:05.290 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.020)       0:00:05.311 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.021)       0:00:05.333 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.018)       0:00:05.351 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:39:09 +0000 (0:00:00.810)       0:00:06.162 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name":
"NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": 
"kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": 
"modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:39:11 +0000 (0:00:01.791) 0:00:07.954 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:39:11 +0000 (0:00:00.043) 0:00:07.997 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:39:11 +0000 (0:00:00.026) 0:00:08.024 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.498) 0:00:08.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.029) 0:00:08.551 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.054) 0:00:08.605 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.033) 0:00:08.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.032) 0:00:08.671 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.041) 0:00:08.712 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.029) 0:00:08.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.028) 0:00:08.770 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.029) 0:00:08.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:39:12 +0000 (0:00:00.029) 0:00:08.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:39:13 +0000 (0:00:00.482) 0:00:09.312 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:39:13 +0000 (0:00:00.028) 0:00:09.341 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:12 Wednesday 01 June 2022 16:39:13 +0000 (0:00:00.796) 0:00:10.137 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:19 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.029) 0:00:10.166 
******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.045) 0:00:10.212 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.500) 0:00:10.713 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.034) 0:00:10.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.028) 0:00:10.776 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a partition device mounted on "/opt/test1"] *********************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:23 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.031) 0:00:10.808 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.053) 0:00:10.862 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:39:14 +0000 (0:00:00.040) 0:00:10.902 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.494) 0:00:11.397 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.072) 0:00:11.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.030) 0:00:11.500 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.030) 0:00:11.531 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.059) 0:00:11.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.024) 0:00:11.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.028) 0:00:11.644 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.034) 0:00:11.679 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.030) 0:00:11.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.028) 0:00:11.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.064) 0:00:11.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.029) 0:00:11.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.028) 0:00:11.861 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.042) 0:00:11.903 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:39:15 +0000 (0:00:00.026) 0:00:11.930 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:39:17 +0000 (0:00:01.711) 0:00:13.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.030) 0:00:13.672 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.027) 0:00:13.700 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.042) 0:00:13.742 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": 
"UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.036) 0:00:13.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.032) 0:00:13.811 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:39:17 +0000 (0:00:00.029) 0:00:13.841 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 
Wednesday 01 June 2022 16:39:18 +0000 (0:00:00.921) 0:00:14.763 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:39:19 +0000 (0:00:00.546) 0:00:15.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:39:19 +0000 (0:00:00.658) 0:00:15.968 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:39:20 +0000 (0:00:00.366) 0:00:16.335 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:39:20 +0000 (0:00:00.029) 0:00:16.364 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:37 Wednesday 01 June 2022 16:39:21 +0000 (0:00:00.820) 0:00:17.185 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:39:21 +0000 (0:00:00.052) 0:00:17.237 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:39:21 +0000 
(0:00:00.039) 0:00:17.276 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:39:21 +0000 (0:00:00.029) 0:00:17.305 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] 
***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:39:21 +0000 (0:00:00.508) 0:00:17.814 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002557", "end": "2022-06-01 12:39:21.587791", "rc": 0, "start": "2022-06-01 12:39:21.585234" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.508) 0:00:18.322 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002366", "end": "2022-06-01 12:39:21.948614", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:39:21.946248" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.362) 0:00:18.684 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.070) 0:00:18.755 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.030) 0:00:18.786 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.108) 0:00:18.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.031) 0:00:18.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.030) 0:00:18.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.030) 0:00:18.987 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.030) 0:00:19.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.034) 0:00:19.051 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.031) 0:00:19.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.029) 0:00:19.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:39:22 +0000 (0:00:00.030) 0:00:19.143 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.053) 0:00:19.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.031) 0:00:19.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.031) 0:00:19.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.038) 0:00:19.298 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.033) 0:00:19.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.029) 0:00:19.361 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.029) 0:00:19.391 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.030) 0:00:19.421 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.029) 0:00:19.450 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.056) 0:00:19.507 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.038) 0:00:19.546 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.059) 0:00:19.606 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.033) 0:00:19.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.030) 0:00:19.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.030) 0:00:19.700 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.030) 0:00:19.731 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.060) 0:00:19.792 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.039) 0:00:19.832 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.029) 0:00:19.862 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.055) 0:00:19.917 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.032) 0:00:19.950 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.117) 0:00:20.067 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.034) 0:00:20.101 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:39:23 +0000 (0:00:00.043) 0:00:20.145 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.038) 0:00:20.183 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.033) 0:00:20.217 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.034) 0:00:20.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.029) 0:00:20.281 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.028) 0:00:20.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.031) 0:00:20.341 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.031) 0:00:20.372 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.044) 0:00:20.417 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.034) 0:00:20.451 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.036) 0:00:20.487 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.029) 0:00:20.517 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.029) 0:00:20.547 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.037) 0:00:20.584 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.036) 0:00:20.621 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101556.8361216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101556.8361216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5187, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101556.8361216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.408) 0:00:21.029 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.037) 0:00:21.067 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.036) 0:00:21.103 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:39:24 +0000 (0:00:00.042) 0:00:21.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.177 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.039) 0:00:21.216 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.248 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.032) 0:00:21.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.312 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.040) 0:00:21.352 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.384 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.048) 0:00:21.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.065) 0:00:21.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.053) 0:00:21.552 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.059) 0:00:21.611 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.062) 0:00:21.674 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.057) 0:00:21.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.067) 0:00:21.800 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.032) 0:00:21.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.864 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:21.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.030) 0:00:21.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.029) 0:00:21.955 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.032) 0:00:21.987 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.031) 0:00:22.019 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.030) 0:00:22.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.030) 0:00:22.081 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.032) 0:00:22.114 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:39:25 +0000 (0:00:00.030) 0:00:22.144 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.033) 0:00:22.177 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.208 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.033) 0:00:22.241 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.272 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.302 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.029) 0:00:22.332 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.363 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.029) 0:00:22.392 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.032) 0:00:22.425 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.033) 0:00:22.458 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.490 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.032) 0:00:22.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.615 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.034) 0:00:22.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.681 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.711 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.034) 0:00:22.746 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.777 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.031) 0:00:22.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation minus fs_type to verify idempotence] ******
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:39
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.030) 0:00:22.840 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.066) 0:00:22.906 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:39:26 +0000 (0:00:00.042) 0:00:22.949 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.513) 0:00:23.463 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] =>
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.070) 0:00:23.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.073) 0:00:23.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.029) 0:00:23.637 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 
01 June 2022 16:39:27 +0000 (0:00:00.062) 0:00:23.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.025) 0:00:23.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.031) 0:00:23.758 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.037) 0:00:23.795 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.033) 0:00:23.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.030) 0:00:23.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.029) 0:00:23.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.029) 0:00:23.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.030) 0:00:23.949 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.043) 0:00:23.992 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:39:27 +0000 (0:00:00.027) 0:00:24.020 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" } ], "packages": [ 
"e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:39:29 +0000 (0:00:01.188) 0:00:25.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.031) 0:00:25.240 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.028) 0:00:25.269 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.037) 0:00:25.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.036) 0:00:25.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.035) 0:00:25.379 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.028) 0:00:25.407 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:39:29 +0000 (0:00:00.648) 0:00:26.055 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:39:30 +0000 (0:00:00.384) 0:00:26.440 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:39:30 +0000 (0:00:00.652) 0:00:27.093 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:39:31 +0000 (0:00:00.360) 0:00:27.453 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:39:31 +0000 (0:00:00.028) 0:00:27.482 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert file system is preserved on existing partition volume] ************ task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:52 Wednesday 01 June 2022 16:39:32 +0000 (0:00:00.825) 0:00:28.307 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:57 Wednesday 01 June 2022 16:39:32 +0000 (0:00:00.035) 0:00:28.343 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:39:32 +0000 (0:00:00.086) 0:00:28.430 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:39:32 +0000 (0:00:00.038) 0:00:28.468 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:39:32 +0000 (0:00:00.030) 0:00:28.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 
16:39:32 +0000 (0:00:00.374) 0:00:28.873 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003049", "end": "2022-06-01 12:39:32.518074", "rc": 0, "start": "2022-06-01 12:39:32.515025" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.381) 0:00:29.254 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002376", "end": "2022-06-01 12:39:32.876816", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:39:32.874440" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.357) 0:00:29.612 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
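The `loop_var` warning above fires because an included task file iterates with a loop variable name (`storage_test_pool`) that an enclosing loop already uses. A minimal sketch of the remedy Ansible suggests — giving the inner loop its own name via `loop_control` — follows; the task name, file path, and replacement variable name here are illustrative, not taken from the test suite:

```yaml
# Hypothetical illustration of the loop_control fix the warning suggests:
# give the inner loop a distinct variable name so it cannot shadow
# 'storage_test_pool' from an enclosing loop.
- name: Verify each pool
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_inner   # instead of reusing storage_test_pool
```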
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.062) 0:00:29.675 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:29.704 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.058) 0:00:29.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:29.792 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.027) 0:00:29.819 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.027) 0:00:29.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.030) 0:00:29.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:29.906 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:29.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:29.965 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.026) 0:00:29.992 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.055) 0:00:30.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.028) 0:00:30.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.029) 0:00:30.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:39:33 +0000 (0:00:00.030) 0:00:30.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.032) 0:00:30.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.029) 0:00:30.198 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.030) 0:00:30.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.030) 0:00:30.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.031) 0:00:30.290 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.057) 0:00:30.348 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.042) 0:00:30.391 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.062) 0:00:30.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.035) 0:00:30.488 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.028) 0:00:30.517 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 
01 June 2022 16:39:34 +0000 (0:00:00.027) 0:00:30.544 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.032) 0:00:30.576 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.060) 0:00:30.637 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": 
"UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.039) 0:00:30.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.029) 0:00:30.705 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.061) 0:00:30.767 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.033) 0:00:30.801 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.147) 0:00:30.949 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.034) 0:00:30.983 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71a4ae25-c35d-46f9-97b7-5b501a54a24b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.040) 0:00:31.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.035) 0:00:31.060 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.035) 0:00:31.095 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 16:39:34 +0000 (0:00:00.036) 0:00:31.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:39:34 +0000 (0:00:00.029) 0:00:31.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.028) 0:00:31.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.028) 0:00:31.219 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.030) 0:00:31.250 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.046) 0:00:31.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.034) 0:00:31.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.035) 0:00:31.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.027) 0:00:31.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.029) 0:00:31.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:39:35 
+0000 (0:00:00.038) 0:00:31.462 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.038) 0:00:31.500 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101568.2731216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101568.2731216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5187, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101568.2731216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.388) 0:00:31.889 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.034) 0:00:31.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.036) 0:00:31.960 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.034) 0:00:31.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.032) 0:00:32.026 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.035) 0:00:32.062 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.032) 0:00:32.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.028) 0:00:32.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:39:35 +0000 (0:00:00.030) 0:00:32.153 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we 
got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.047) 0:00:32.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.041) 0:00:32.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.031) 0:00:32.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.031) 0:00:32.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.027) 0:00:32.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.027) 0:00:32.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.039) 0:00:32.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.039) 0:00:32.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.028) 0:00:32.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:32.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.027) 0:00:32.526 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.031) 0:00:32.557 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:32.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:32.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.030) 0:00:32.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.033) 0:00:32.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.030) 0:00:32.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.031) 0:00:32.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.030) 0:00:32.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.039) 0:00:32.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.030) 0:00:32.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.033) 0:00:32.876 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.032) 0:00:32.908 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:32.937 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:32.966 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.031) 0:00:32.998 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.030) 0:00:33.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.029) 0:00:33.057 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.033) 0:00:33.091 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.032) 0:00:33.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 16:39:36 +0000 (0:00:00.028) 0:00:33.152 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 0:00:33.182 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 0:00:33.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.079) 0:00:33.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.031) 0:00:33.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.030) 0:00:33.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 
0:00:33.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 0:00:33.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 0:00:33.442 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.031) 0:00:33.473 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.026) 0:00:33.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the partition created above] ************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:59 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.029) 0:00:33.529 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.074) 0:00:33.604 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : 
Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:39:37 +0000 (0:00:00.044) 0:00:33.648 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.525) 0:00:34.174 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.070) 0:00:34.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:39:38 
+0000 (0:00:00.031) 0:00:34.276 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.030) 0:00:34.307 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.059) 0:00:34.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.026) 0:00:34.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.031) 0:00:34.423 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.037) 0:00:34.460 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT 
DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.031) 0:00:34.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.030) 0:00:34.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.029) 0:00:34.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.030) 0:00:34.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.031) 0:00:34.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:39:38 
+0000 (0:00:00.040) 0:00:34.654 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:39:38 +0000 (0:00:00.028) 0:00:34.683 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:39:40 +0000 (0:00:01.532) 0:00:36.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:39:40 +0000 (0:00:00.031) 0:00:36.247 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:39:40 +0000 (0:00:00.029) 0:00:36.276 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:39:40 +0000 (0:00:00.039) 0:00:36.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:39:40 +0000 (0:00:00.037) 0:00:36.353 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:39:40 +0000 (0:00:00.034) 0:00:36.388 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:39:40 +0000 
(0:00:00.386) 0:00:36.774 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:39:41 +0000 (0:00:00.653) 0:00:37.428 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:39:41 +0000 (0:00:00.030) 0:00:37.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:39:41 +0000 (0:00:00.623) 0:00:38.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:39:42 +0000 (0:00:00.352) 0:00:38.435 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:39:42 +0000 (0:00:00.028) 0:00:38.464 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:74 Wednesday 01 June 2022 16:39:43 +0000 (0:00:00.824) 0:00:39.289 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:39:43 +0000 (0:00:00.058) 0:00:39.347 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:39:43 +0000 (0:00:00.039) 0:00:39.387 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:39:43 +0000 (0:00:00.031) 0:00:39.418 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:39:43 +0000 (0:00:00.388) 0:00:39.807 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003235", 
"end": "2022-06-01 12:39:43.443496", "rc": 0, "start": "2022-06-01 12:39:43.440261" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.377) 0:00:40.185 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002542", "end": "2022-06-01 12:39:43.804034", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:39:43.801492" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.354) 0:00:40.539 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.062) 0:00:40.602 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.030) 0:00:40.632 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.059) 0:00:40.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:40.721 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.028) 0:00:40.750 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.026) 0:00:40.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:40.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:40.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.033) 0:00:40.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.030) 0:00:40.900 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.083) 0:00:40.983 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.055) 0:00:41.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:41.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:41.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.029) 0:00:41.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:39:44 +0000 (0:00:00.030) 0:00:41.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:41.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:41.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:41.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.030) 0:00:41.276 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.058) 0:00:41.334 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.041) 0:00:41.376 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.076) 0:00:41.452 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.036) 0:00:41.489 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:41.518 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.027) 0:00:41.545 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.028) 0:00:41.573 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.058) 0:00:41.631 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=71a4ae25-c35d-46f9-97b7-5b501a54a24b", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.037) 0:00:41.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.027) 0:00:41.696 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.074) 0:00:41.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.038) 0:00:41.809 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.116) 0:00:41.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.035) 0:00:41.961 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.042) 0:00:42.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.033) 0:00:42.037 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.036) 0:00:42.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:42.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:39:45 +0000 (0:00:00.029) 0:00:42.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.031) 
0:00:42.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.029) 0:00:42.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.033) 0:00:42.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.045) 0:00:42.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.024) 0:00:42.298 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.034) 0:00:42.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.028) 0:00:42.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.029) 0:00:42.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.030) 0:00:42.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.024) 0:00:42.446 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.358) 0:00:42.804 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.038) 0:00:42.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.027) 0:00:42.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.033) 0:00:42.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.029) 0:00:42.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.028) 0:00:42.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.030) 0:00:42.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.030) 0:00:43.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.030) 0:00:43.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.026) 0:00:43.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.030) 0:00:43.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:39:46 +0000 (0:00:00.032) 0:00:43.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.071) 0:00:43.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.029) 0:00:43.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.037) 0:00:43.309 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.033) 0:00:43.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.372 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.029) 0:00:43.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.028) 0:00:43.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.031) 0:00:43.577 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.030) 0:00:43.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.032) 0:00:43.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.035) 0:00:43.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.030) 0:00:43.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.031) 0:00:43.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.036) 0:00:43.773 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT 
DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.036) 0:00:43.810 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.032) 0:00:43.842 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.031) 0:00:43.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.029) 0:00:43.904 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.029) 0:00:43.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.032) 0:00:43.966 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 
2022 16:39:47 +0000 (0:00:00.034) 0:00:44.001 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.035) 0:00:44.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.034) 0:00:44.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.032) 0:00:44.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:39:47 +0000 (0:00:00.030) 0:00:44.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.031) 0:00:44.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.028) 
0:00:44.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.029) 0:00:44.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.028) 0:00:44.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.028) 0:00:44.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.030) 0:00:44.310 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.030) 0:00:44.341 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.026) 0:00:44.368 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:76 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.028) 0:00:44.397 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.072) 0:00:44.469 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.042) 0:00:44.511 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.491) 0:00:45.002 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.073) 0:00:45.076 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.030) 0:00:45.106 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:39:48 +0000 (0:00:00.031) 0:00:45.138 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.060) 0:00:45.198 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.025) 0:00:45.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.031) 0:00:45.255 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.038) 0:00:45.293 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.032) 0:00:45.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.029) 0:00:45.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.029) 0:00:45.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:39:49 
+0000 (0:00:00.027) 0:00:45.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.080) 0:00:45.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.044) 0:00:45.538 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:39:49 +0000 (0:00:00.027) 0:00:45.565 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:39:50 +0000 (0:00:01.134) 0:00:46.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.029) 0:00:46.729 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.028) 0:00:46.757 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.038) 0:00:46.796 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.037) 0:00:46.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.034) 0:00:46.868 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.029) 0:00:46.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.032) 0:00:46.930 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.028) 0:00:46.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:39:50 +0000 (0:00:00.027) 0:00:46.986 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:39:51 +0000 (0:00:00.369) 0:00:47.356 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:39:51 +0000 (0:00:00.030) 0:00:47.386 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:91 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.809) 0:00:48.196 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.064) 0:00:48.261 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.041) 0:00:48.302 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.030) 0:00:48.333 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" 
}, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.366) 0:00:48.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002697", "end": "2022-06-01 12:39:52.340844", "rc": 0, "start": "2022-06-01 12:39:52.338147" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:39:52 +0000 (0:00:00.388) 0:00:49.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002692", "end": "2022-06-01 12:39:52.718242", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:39:52.715550" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.368) 0:00:49.456 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.061) 0:00:49.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.029) 0:00:49.547 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.060) 0:00:49.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.030) 0:00:49.638 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.028) 0:00:49.666 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.028) 0:00:49.695 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.030) 0:00:49.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.076) 0:00:49.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.031) 0:00:49.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.029) 0:00:49.863 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.026) 0:00:49.889 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.052) 0:00:49.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.031) 0:00:49.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.029) 0:00:50.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.029) 0:00:50.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.028) 0:00:50.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.029) 0:00:50.091 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.030) 0:00:50.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:39:53 +0000 (0:00:00.031) 0:00:50.153 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.028) 0:00:50.181 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.054) 0:00:50.236 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.039) 0:00:50.276 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.060) 0:00:50.337 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.034) 0:00:50.372 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.027) 0:00:50.399 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.027) 0:00:50.427 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.029) 0:00:50.457 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.064) 0:00:50.521 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.040) 0:00:50.562 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.029) 0:00:50.591 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
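The two `loop_var` warnings in this run come from nested task includes that both rely on Ansible's default loop variable handling, so the inner loop shadows a variable (`storage_test_pool`, `storage_test_volume`) already set by an enclosing loop. The remedy the warning itself names is to give the inner loop a distinct variable via `loop_control`. A hypothetical sketch (the task name, file name, and variable names here are illustrative, not taken from the test sources):

```yaml
# Illustrative fix for the loop-variable collision warning: name the
# inner loop's variable explicitly so it cannot shadow a variable set
# by an enclosing loop in a parent include.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner  # any name unused by outer loops
```

With an explicit `loop_var`, the included tasks reference `storage_test_volume_inner` instead of the default, and the "already in use" warning no longer fires.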
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.056) 0:00:50.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.033) 0:00:50.682 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.113) 0:00:50.796 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.038) 0:00:50.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.039) 0:00:50.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.028) 0:00:50.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.033) 0:00:50.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.029) 0:00:50.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.031) 0:00:50.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.041) 0:00:51.038 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.030) 0:00:51.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.031) 0:00:51.100 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:39:54 +0000 (0:00:00.046) 0:00:51.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.024) 0:00:51.171 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 16:39:55 +0000 (0:00:00.037) 0:00:51.209 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.030) 0:00:51.239 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.030) 0:00:51.270 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.028) 0:00:51.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.024) 0:00:51.323 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.388) 0:00:51.712 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.040) 0:00:51.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.027) 0:00:51.780 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.034) 0:00:51.815 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.030) 0:00:51.845 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.025) 0:00:51.871 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.028) 0:00:51.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.029) 0:00:51.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.031) 0:00:51.961 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.072) 0:00:52.034 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.031) 0:00:52.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.029) 0:00:52.095 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.030) 0:00:52.126 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:39:55 +0000 (0:00:00.030) 0:00:52.156 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.033) 0:00:52.190 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.037) 0:00:52.228 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.033) 0:00:52.261 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.290 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.320 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.349 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:52.380 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:52.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.498 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.527 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.557 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:52.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.029) 0:00:52.617 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:52.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:52.677 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.034) 0:00:52.712 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.031) 0:00:52.743 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.032) 0:00:52.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.043) 0:00:52.820 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.040) 0:00:52.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.033) 0:00:52.894 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.036) 0:00:52.930 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.035) 0:00:52.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.034) 0:00:53.000 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.031) 0:00:53.031 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.031) 0:00:53.062 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.031) 0:00:53.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:53.124 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:39:56 +0000 (0:00:00.030) 0:00:53.154 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.033) 0:00:53.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.032) 0:00:53.220 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.031) 0:00:53.252 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.030) 0:00:53.283 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.028) 0:00:53.312 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=297 changed=5 unreachable=0 failed=0 skipped=316 rescued=0 ignored=0

Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.015) 0:00:53.327 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.19s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.13s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.06s
/tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:2 --------
linux-system-roles.storage : make sure blivet is available -------------- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:39:57 +0000 (0:00:00.024) 0:00:00.024 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:39:59 +0000 (0:00:01.298) 0:00:01.323 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.30s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_partition_volume_then_remove_nvme_generated.yml *********
2 plays in /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:39:59 +0000 (0:00:00.017) 0:00:01.340 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.30s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:39:59 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:40:01 +0000 (0:00:01.251) 0:00:01.274 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_partition_volume_then_remove_scsi_generated.yml *********
2 plays in /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:3
Wednesday 01 June 2022 16:40:01 +0000 (0:00:00.016) 0:00:01.290 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:7
Wednesday 01 June 2022 16:40:02 +0000 (0:00:01.061) 0:00:02.351 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:2
Wednesday 01 June 2022 16:40:02 +0000 (0:00:00.026) 0:00:02.378 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:9
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.806) 0:00:03.185 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.040) 0:00:03.225 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.158) 0:00:03.383 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.523) 0:00:03.907 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.078) 0:00:03.986 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:40:03 +0000 (0:00:00.023) 0:00:04.010 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:40:04 +0000 (0:00:00.023) 0:00:04.033 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:40:04 +0000 (0:00:00.192) 0:00:04.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:40:04 +0000 (0:00:00.020) 0:00:04.246 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:40:05 +0000 (0:00:01.098) 0:00:05.345 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:40:05 +0000 (0:00:00.049) 0:00:05.394 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:40:05 +0000 (0:00:00.047) 0:00:05.441 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:40:06 +0000 (0:00:00.683) 0:00:06.124 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:40:06 +0000 (0:00:00.087) 0:00:06.211 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:40:06 +0000 (0:00:00.019) 0:00:06.231 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:40:06 +0000 (0:00:00.021) 0:00:06.253 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:40:06 +0000 (0:00:00.021) 0:00:06.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:40:07 +0000 (0:00:00.777) 0:00:07.052 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": {
    "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
    "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
    "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
    "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
    "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
    "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
    "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
    "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
    "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
    "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
    "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
    "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
    "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
    "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
    "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
    "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
    "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped",
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:40:08 +0000 (0:00:01.860) 0:00:08.912 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:40:08 +0000 (0:00:00.044) 0:00:08.957 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:40:08 +0000 (0:00:00.028) 0:00:08.985 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.514) 0:00:09.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.027) 0:00:09.530 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.027) 0:00:09.557 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.031) 0:00:09.589 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.031) 0:00:09.621 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.031) 0:00:09.652 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.028) 0:00:09.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.027) 0:00:09.708 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.026) 0:00:09.735 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:40:09 +0000 (0:00:00.028) 0:00:09.763 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:40:10 +0000 (0:00:00.456) 0:00:10.220 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:40:10 +0000 (0:00:00.028) 0:00:10.249 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:12
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.801) 0:00:11.051 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:19
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.033) 0:00:11.085 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.044) 0:00:11.129 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.504) 0:00:11.633 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.037) 0:00:11.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.031) 0:00:11.702 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create a partition device mounted on "/opt/test1"] ***********************
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:23
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.054) 0:00:11.736 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.054) 0:00:11.790 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:40:11 +0000 (0:00:00.043) 0:00:11.834 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.508) 0:00:12.343 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.068) 0:00:12.411 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.030) 0:00:12.442 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.028) 0:00:12.470 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.062) 0:00:12.533 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.031) 0:00:12.564 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.032) 0:00:12.596 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.037) 0:00:12.634 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:40:12 +0000
(0:00:00.034) 0:00:12.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.030) 0:00:12.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.029) 0:00:12.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.028) 0:00:12.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.028) 0:00:12.786 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:40:12 +0000 (0:00:00.042) 0:00:12.829 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 
16:40:12 +0000 (0:00:00.030) 0:00:12.859 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, 
"state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:40:14 +0000 (0:00:01.737) 0:00:14.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.030) 0:00:14.626 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.028) 0:00:14.655 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": 
"present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.039) 0:00:14.694 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.035) 0:00:14.730 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.032) 0:00:14.762 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:40:14 +0000 (0:00:00.029) 0:00:14.791 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:40:15 +0000 (0:00:00.899) 0:00:15.691 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'state': u'mounted', u'dump': 0, 
u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:40:16 +0000 (0:00:00.536) 0:00:16.227 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:40:16 +0000 (0:00:00.667) 0:00:16.894 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:40:17 +0000 (0:00:00.354) 0:00:17.248 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:40:17 +0000 (0:00:00.031) 0:00:17.280 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:37 Wednesday 01 June 2022 16:40:18 +0000 (0:00:00.857) 0:00:18.138 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print 
out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:40:18 +0000 (0:00:00.053) 0:00:18.191 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:40:18 +0000 (0:00:00.039) 0:00:18.230 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:40:18 +0000 (0:00:00.029) 0:00:18.260 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 
16:40:18 +0000 (0:00:00.471) 0:00:18.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002857", "end": "2022-06-01 12:40:18.599898", "rc": 0, "start": "2022-06-01 12:40:18.597041" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.468) 0:00:19.201 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002384", "end": "2022-06-01 12:40:18.969924", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:40:18.967540" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.365) 0:00:19.566 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.062) 0:00:19.629 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.659 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.062) 0:00:19.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.029) 0:00:19.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.031) 0:00:19.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.030) 0:00:19.965 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:40:19 +0000 (0:00:00.056) 0:00:20.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.031) 0:00:20.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.029) 0:00:20.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.029) 0:00:20.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.030) 0:00:20.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.029) 0:00:20.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.037) 0:00:20.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.033) 0:00:20.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.032) 0:00:20.275 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.059) 0:00:20.334 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.040) 0:00:20.375 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.100) 0:00:20.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.036) 0:00:20.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.031) 0:00:20.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.031) 0:00:20.576 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } 
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.033) 0:00:20.610 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.064) 0:00:20.675 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => {"ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": {"_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null}}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.041) 0:00:20.716 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.032) 0:00:20.749 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.055) 0:00:20.804 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.034) 0:00:20.839 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.126) 0:00:20.966 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/sda1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:40:20 +0000 (0:00:00.033) 0:00:21.000 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.042) 0:00:21.042 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.037) 0:00:21.079 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.035) 0:00:21.114 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.036) 0:00:21.151 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.032) 0:00:21.184 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.030) 0:00:21.214 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.030) 0:00:21.244 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.030) 0:00:21.275 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad "], "storage_test_fstab_mount_options_matches": [" /opt/test1 ext4 defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.044) 0:00:21.319 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.032) 0:00:21.352 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.037) 0:00:21.389 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.030) 0:00:21.420 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.030) 0:00:21.450 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.037) 0:00:21.488 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.038) 0:00:21.526 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654101613.9341216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101613.9341216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5421, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101613.9341216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.387) 0:00:21.914 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.038) 0:00:21.952 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:40:21 +0000 (0:00:00.038) 0:00:21.991 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "partition"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.033) 0:00:22.025 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.029) 0:00:22.054 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.033) 0:00:22.088 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.029) 0:00:22.117 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.028) 0:00:22.146 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.031) 0:00:22.178 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.035) 0:00:22.214 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.029) 0:00:22.244 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.029) 0:00:22.273 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.029) 0:00:22.303 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.028) 0:00:22.331 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.031) 0:00:22.362 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.036) 0:00:22.399 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.035) 0:00:22.434 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.030) 0:00:22.465 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.031) 0:00:22.497 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.031) 0:00:22.528 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.036) 0:00:22.564 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.104) 0:00:22.669 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.033) 0:00:22.703 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.032) 0:00:22.735 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.030) 0:00:22.765 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.028) 0:00:22.794 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.027) 0:00:22.822 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.033) 0:00:22.855 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.034) 0:00:22.890 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.037) 0:00:22.927 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.031) 0:00:22.958 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.030) 0:00:22.991 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:40:22 +0000 (0:00:00.032) 0:00:23.022 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.032) 0:00:23.055 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.031) 0:00:23.086 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.033) 0:00:23.120 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.033) 0:00:23.153 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.041) 0:00:23.194 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.033) 0:00:23.228 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.032) 0:00:23.261 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.290 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.030) 0:00:23.321 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.350 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.380 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.409 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.031) 0:00:23.441 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.470 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.500 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.529 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.026) 0:00:23.556 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}

TASK [Repeat the previous invocation minus fs_type to verify idempotence] ******
task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:39
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.029) 0:00:23.585 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.066) 0:00:23.651 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:40:23 +0000 (0:00:00.046) 0:00:23.698 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.510) 0:00:24.208 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.071) 0:00:24.279 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.031) 0:00:24.311 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.031) 0:00:24.342 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.065) 0:00:24.408 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.027) 0:00:24.435 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.030) 0:00:24.465 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": [{"disks": ["sda"], "name": "sda", "type": "partition", "volumes": [{"mount_point": "/opt/test1", "name": "test1", "type": "partition"}]}]}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.036) 0:00:24.502 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.034) 0:00:24.537 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.030) 0:00:24.568 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.030) 0:00:24.598 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.028) 0:00:24.627 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.029) 0:00:24.656 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.044) 0:00:24.700 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:40:24 +0000 (0:00:00.030) 0:00:24.731 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted"}], "packages": ["dosfstools", "xfsprogs", "e2fsprogs"], "pools": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [{"_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null}]}], "volumes": []}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:40:25 +0000 (0:00:01.135) 0:00:25.867 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:40:25 +0000 (0:00:00.030) 0:00:25.897 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:40:25 +0000 (0:00:00.029) 0:00:25.927 ********
ok: [/cache/rhel-x.qcow2] => {"blivet_output": {"actions": [], "changed": false, "crypts": [], "failed": false, "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted"}], "packages": ["dosfstools", "xfsprogs", "e2fsprogs"], "pools": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [{"_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null}]}], "volumes": []}}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:40:25 +0000 (0:00:00.040) 0:00:25.968 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [{"_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null}]}]}, "changed": false}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022
16:40:25 +0000 (0:00:00.038) 0:00:26.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:40:26 +0000 (0:00:00.034) 0:00:26.041 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:40:26 +0000 (0:00:00.029) 0:00:26.070 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:40:26 +0000 (0:00:00.679) 0:00:26.750 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:40:27 +0000 (0:00:00.388) 0:00:27.138 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:40:27 +0000 (0:00:00.654) 0:00:27.793 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:40:28 +0000 (0:00:00.382) 0:00:28.175 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:40:28 +0000 (0:00:00.028) 0:00:28.204 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert file system is preserved on existing partition volume] ************ task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:52 Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.828) 0:00:29.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:57 Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.035) 0:00:29.068 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.054) 0:00:29.123 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.040) 0:00:29.163 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.031) 0:00:29.194 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 
16:40:29 +0000 (0:00:00.369) 0:00:29.564 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002530",
    "end": "2022-06-01 12:40:29.330476",
    "rc": 0,
    "start": "2022-06-01 12:40:29.327946"
}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:40:29 +0000 (0:00:00.361) 0:00:29.925 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002396",
    "end": "2022-06-01 12:40:29.701869",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-06-01 12:40:29.699473"
}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.373) 0:00:30.299 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.061) 0:00:30.360 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.029) 0:00:30.390 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.056) 0:00:30.447 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.027) 0:00:30.474 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.029) 0:00:30.504 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.026) 0:00:30.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.031) 0:00:30.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.030) 0:00:30.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.029) 0:00:30.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.030) 0:00:30.653 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.032) 0:00:30.685 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.056) 0:00:30.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.032) 0:00:30.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.031) 0:00:30.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.032) 0:00:30.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.030) 0:00:30.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.031) 0:00:30.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.030) 0:00:30.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.030) 0:00:30.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:40:30 +0000 (0:00:00.031) 0:00:30.992 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.060) 0:00:31.053 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.041) 0:00:31.095 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.068) 0:00:31.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.036) 0:00:31.200 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.028) 0:00:31.229 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 
01 June 2022 16:40:31 +0000 (0:00:00.028) 0:00:31.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.031) 0:00:31.289 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.062) 0:00:31.352 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": 
"UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.039) 0:00:31.392 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.030) 0:00:31.422 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.058) 0:00:31.481 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.035) 0:00:31.516 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.123) 0:00:31.640 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.037) 0:00:31.678 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "f71ae531-73fb-4dfa-9341-8a7decb36bad" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.044) 0:00:31.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.035) 0:00:31.758 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.079) 0:00:31.838 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 16:40:31 +0000 (0:00:00.039) 0:00:31.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.030) 0:00:31.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.031) 0:00:31.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.030) 0:00:31.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:40:31 +0000 (0:00:00.030) 0:00:32.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.048) 0:00:32.049 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.035) 0:00:32.085 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.036) 0:00:32.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.029) 0:00:32.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.031) 0:00:32.183 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:40:32 
+0000 (0:00:00.037) 0:00:32.220 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.038) 0:00:32.259 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101625.0741215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101625.0741215, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5421, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654101625.0741215, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.378) 0:00:32.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.036) 0:00:32.674 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.038) 0:00:32.713 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.034) 0:00:32.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.029) 0:00:32.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.033) 0:00:32.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.039) 0:00:32.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.029) 0:00:32.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.036) 0:00:32.916 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we 
got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.040) 0:00:32.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.031) 0:00:32.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:40:32 +0000 (0:00:00.030) 0:00:33.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.032) 0:00:33.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.114 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.038) 0:00:33.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.035) 0:00:33.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.281 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.311 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.037) 0:00:33.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.037) 0:00:33.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.033) 0:00:33.631 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.034) 0:00:33.666 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.697 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.759 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.030) 0:00:33.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.032) 0:00:33.822 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.035) 0:00:33.858 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.032) 0:00:33.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.029) 0:00:33.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.031) 0:00:33.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.033) 0:00:33.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:40:33 +0000 (0:00:00.037) 0:00:34.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.032) 0:00:34.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.031) 0:00:34.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.030) 
0:00:34.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.031) 0:00:34.149 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.032) 0:00:34.181 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.034) 0:00:34.216 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.075) 0:00:34.291 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the partition created above] ************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:59 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.032) 0:00:34.323 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.073) 0:00:34.396 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : 
Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.047) 0:00:34.443 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:40:34 +0000 (0:00:00.524) 0:00:34.968 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.073) 0:00:35.041 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:40:35 
+0000 (0:00:00.029) 0:00:35.071 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.033) 0:00:35.104 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.062) 0:00:35.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.026) 0:00:35.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.033) 0:00:35.226 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.040) 0:00:35.266 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT 
DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.039) 0:00:35.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.031) 0:00:35.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.029) 0:00:35.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.029) 0:00:35.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.031) 0:00:35.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:40:35 
+0000 (0:00:00.043) 0:00:35.471 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:40:35 +0000 (0:00:00.030) 0:00:35.501 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:40:37 +0000 (0:00:01.637) 0:00:37.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:40:37 +0000 (0:00:00.032) 0:00:37.171 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:40:37 +0000 (0:00:00.029) 0:00:37.201 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:40:37 +0000 (0:00:00.039) 0:00:37.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:40:37 +0000 (0:00:00.034) 0:00:37.275 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:40:37 +0000 (0:00:00.033) 0:00:37.309 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:40:37 +0000 
(0:00:00.380) 0:00:37.690 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:40:38 +0000 (0:00:00.657) 0:00:38.347 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:40:38 +0000 (0:00:00.029) 0:00:38.376 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:40:39 +0000 (0:00:00.648) 0:00:39.025 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:40:39 +0000 (0:00:00.368) 0:00:39.393 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:40:39 +0000 (0:00:00.030) 0:00:39.424 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:74 Wednesday 01 June 2022 16:40:40 +0000 (0:00:00.794) 0:00:40.219 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:40:40 +0000 (0:00:00.057) 0:00:40.276 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:40:40 +0000 (0:00:00.039) 0:00:40.315 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:40:40 +0000 (0:00:00.031) 0:00:40.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:40:40 +0000 (0:00:00.382) 0:00:40.729 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002364", 
"end": "2022-06-01 12:40:40.493567", "rc": 0, "start": "2022-06-01 12:40:40.491203" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.362) 0:00:41.092 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002840", "end": "2022-06-01 12:40:40.863636", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:40:40.860796" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.369) 0:00:41.461 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.064) 0:00:41.526 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.031) 0:00:41.557 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.061) 0:00:41.619 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.031) 0:00:41.651 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.027) 0:00:41.678 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.026) 0:00:41.704 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.028) 0:00:41.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.033) 0:00:41.766 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.035) 0:00:41.802 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.032) 0:00:41.834 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.067) 0:00:41.902 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.058) 0:00:41.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.030) 0:00:41.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:40:41 +0000 (0:00:00.031) 0:00:42.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.032) 0:00:42.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.031) 0:00:42.086 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.030) 0:00:42.117 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.029) 0:00:42.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.028) 0:00:42.175 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.030) 0:00:42.205 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.058) 0:00:42.263 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.041) 0:00:42.304 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.060) 0:00:42.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.036) 0:00:42.401 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.027) 0:00:42.428 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.025) 0:00:42.453 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.030) 0:00:42.484 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.064) 0:00:42.549 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=f71ae531-73fb-4dfa-9341-8a7decb36bad", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.053) 0:00:42.602 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.033) 0:00:42.636 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.058) 0:00:42.695 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.036) 0:00:42.732 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.115) 0:00:42.847 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.036) 0:00:42.884 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.044) 0:00:42.928 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.030) 0:00:42.958 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:40:42 +0000 (0:00:00.036) 0:00:42.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.029) 0:00:43.024 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.029) 0:00:43.054 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.029) 0:00:43.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.031) 0:00:43.115 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.032) 0:00:43.147 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.046) 0:00:43.193 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.025) 0:00:43.219 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.035) 0:00:43.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.030) 0:00:43.285 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.030) 0:00:43.315 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.029) 0:00:43.344 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.025) 0:00:43.370 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.361) 0:00:43.731 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.036) 0:00:43.768 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.026) 0:00:43.794 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.033) 0:00:43.828 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.030) 0:00:43.858 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.025) 0:00:43.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.030) 0:00:43.914 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.030) 0:00:43.944 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.029) 0:00:43.974 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:40:43 +0000 (0:00:00.026) 0:00:44.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.087) 0:00:44.089 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.031) 0:00:44.120 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.029) 0:00:44.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.212 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.038) 0:00:44.250 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.038) 0:00:44.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.032) 0:00:44.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.029) 0:00:44.350 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false,
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.381 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.033) 0:00:44.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.537 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.031) 0:00:44.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.029) 0:00:44.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.029) 0:00:44.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.031) 0:00:44.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.032) 0:00:44.722 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT 
DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.037) 0:00:44.759 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.033) 0:00:44.793 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.032) 0:00:44.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.033) 0:00:44.859 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:44.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.031) 0:00:44.921 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 
2022 16:40:44 +0000 (0:00:00.035) 0:00:44.957 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.033) 0:00:44.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:40:44 +0000 (0:00:00.030) 0:00:45.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.033) 0:00:45.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.032) 0:00:45.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.030) 0:00:45.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.030) 
0:00:45.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.030) 0:00:45.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.030) 0:00:45.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.034) 0:00:45.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.040) 0:00:45.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.031) 0:00:45.316 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.029) 0:00:45.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:76 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.030) 0:00:45.375 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.074) 0:00:45.450 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.043) 0:00:45.494 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:40:45 +0000 (0:00:00.508) 0:00:46.003 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.073) 0:00:46.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.031) 0:00:46.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.034) 0:00:46.142 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.064) 0:00:46.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.024) 0:00:46.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.031) 0:00:46.263 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.039) 0:00:46.302 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.032) 0:00:46.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.029) 0:00:46.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.032) 0:00:46.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:40:46 
+0000 (0:00:00.075) 0:00:46.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.031) 0:00:46.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.043) 0:00:46.546 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:40:46 +0000 (0:00:00.028) 0:00:46.574 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:40:47 +0000 (0:00:01.107) 0:00:47.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.030) 0:00:47.712 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.027) 0:00:47.739 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.036) 0:00:47.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.037) 0:00:47.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.037) 0:00:47.851 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.030) 0:00:47.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.038) 0:00:47.920 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.029) 0:00:47.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:40:47 +0000 (0:00:00.027) 0:00:47.977 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:40:48 +0000 (0:00:00.394) 0:00:48.372 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:40:48 +0000 (0:00:00.028) 0:00:48.400 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:91 Wednesday 01 June 2022 16:40:49 +0000 (0:00:00.829) 0:00:49.229 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:40:49 +0000 (0:00:00.058) 0:00:49.288 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:40:49 +0000 (0:00:00.037) 0:00:49.326 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:40:49 +0000 (0:00:00.028) 0:00:49.355 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" 
}, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:40:49 +0000 (0:00:00.369) 0:00:49.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003648", "end": "2022-06-01 12:40:49.519404", "rc": 0, "start": "2022-06-01 12:40:49.515756" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.399) 0:00:50.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002680", "end": "2022-06-01 12:40:49.899473", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:40:49.896793" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.375) 0:00:50.500 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.061) 0:00:50.562 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.030) 0:00:50.593 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.075) 0:00:50.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.033) 0:00:50.702 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.031) 0:00:50.734 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.030) 0:00:50.765 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.031) 0:00:50.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.031) 0:00:50.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.031) 0:00:50.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.031) 0:00:50.891 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:40:50 +0000 (0:00:00.083) 0:00:50.975 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.056) 0:00:51.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.031) 0:00:51.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.030) 0:00:51.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.029) 0:00:51.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.033) 0:00:51.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.029) 0:00:51.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.031) 0:00:51.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.030) 0:00:51.247 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.030) 0:00:51.278 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.056) 0:00:51.334 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.039) 0:00:51.374 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.058) 0:00:51.432 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.033) 0:00:51.465 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.029) 0:00:51.495 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.028) 0:00:51.523 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.031) 0:00:51.554 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.061) 0:00:51.615 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.052) 0:00:51.667 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.032) 0:00:51.700 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.059) 0:00:51.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.036) 0:00:51.796 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.119) 0:00:51.915 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.037) 0:00:51.952 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:40:51 +0000 (0:00:00.042) 0:00:51.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.030) 0:00:52.025 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.034) 0:00:52.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.028) 0:00:52.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.030) 0:00:52.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.029) 0:00:52.149 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.030) 0:00:52.180 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.030) 0:00:52.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.045) 0:00:52.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.025) 0:00:52.281 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 16:40:52 +0000 (0:00:00.034) 0:00:52.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.028) 0:00:52.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.032) 0:00:52.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.028) 0:00:52.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.025) 0:00:52.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.380) 0:00:52.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.038) 0:00:52.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.026) 0:00:52.876 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.039) 0:00:52.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.032) 0:00:52.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.025) 0:00:52.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:40:52 +0000 (0:00:00.030) 0:00:53.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device 
node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.034) 0:00:53.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.028) 0:00:53.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.035) 0:00:53.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.078) 0:00:53.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.036) 0:00:53.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.034) 0:00:53.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.029) 0:00:53.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.029) 0:00:53.467 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.558 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.029) 0:00:53.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.804 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.033) 0:00:53.837 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.869 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.029) 0:00:53.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.030) 0:00:53.929 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.032) 0:00:53.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:40:53 +0000 (0:00:00.031) 0:00:53.993 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.033) 0:00:54.026 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.031) 0:00:54.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.030) 0:00:54.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.030) 0:00:54.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.033) 0:00:54.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.030) 0:00:54.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.030) 0:00:54.215 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.030) 0:00:54.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.029) 0:00:54.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.029) 0:00:54.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.033) 0:00:54.338 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.031) 0:00:54.369 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.028) 0:00:54.398 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran 
handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=299 changed=5 unreachable=0 failed=0 skipped=316 rescued=0 ignored=0 Wednesday 01 June 2022 16:40:54 +0000 (0:00:00.015) 0:00:54.413 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.74s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.14s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.10s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.06s /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_scsi_generated.yml:3 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.86s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.81s /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml:2 -------- linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = 
/usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:40:55 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:40:56 +0000 (0:00:01.282) 0:00:01.306 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_raid_pool_then_remove.yml ******************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:2 
Wednesday 01 June 2022 16:40:56 +0000 (0:00:00.025) 0:00:01.331 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:17 Wednesday 01 June 2022 16:40:57 +0000 (0:00:01.033) 0:00:02.365 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:40:57 +0000 (0:00:00.041) 0:00:02.406 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:40:57 +0000 (0:00:00.155) 0:00:02.562 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.541) 0:00:03.103 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.075) 0:00:03.179 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.022) 0:00:03.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.020) 0:00:03.221 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.190) 0:00:03.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:40:58 +0000 (0:00:00.018) 0:00:03.430 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:40:59 +0000 (0:00:01.083) 0:00:04.513 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:40:59 +0000 (0:00:00.044) 0:00:04.558 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:40:59 +0000 (0:00:00.045) 0:00:04.603 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:41:00 +0000 (0:00:00.684) 0:00:05.288 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:41:00 +0000 (0:00:00.079) 0:00:05.367 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:41:00 +0000 (0:00:00.021) 0:00:05.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:41:00 +0000 (0:00:00.021) 0:00:05.409 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:41:00 +0000 (0:00:00.018) 0:00:05.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:41:01 +0000 (0:00:00.799) 0:00:06.228 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:41:03 +0000 (0:00:01.840) 0:00:08.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:41:03 +0000 
(0:00:00.043) 0:00:08.111 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.056) 0:00:08.168 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.524) 0:00:08.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.029) 0:00:08.722 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.026) 0:00:08.749 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.031) 0:00:08.780 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.032) 0:00:08.813 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.031) 0:00:08.845 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:41:03 +0000 (0:00:00.028) 0:00:08.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:41:04 +0000 (0:00:00.027) 0:00:08.901 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:41:04 +0000 (0:00:00.025) 0:00:08.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:41:04 +0000 (0:00:00.025) 0:00:08.952 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:41:04 +0000 (0:00:00.442) 0:00:09.394 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:41:04 +0000 (0:00:00.027) 0:00:09.422 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:20 Wednesday 01 June 2022 16:41:05 +0000 (0:00:00.833) 0:00:10.256 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:27 Wednesday 01 June 2022 16:41:05 +0000 (0:00:00.029) 0:00:10.286 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:41:05 +0000 (0:00:00.044) 0:00:10.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:41:05 +0000 (0:00:00.521) 0:00:10.853 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.035) 0:00:10.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.028) 0:00:10.916 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a RAID0 device] *************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:31 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.031) 0:00:10.947 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.057) 0:00:11.004 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.042) 0:00:11.047 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.514) 0:00:11.561 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.067) 0:00:11.629 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.030) 0:00:11.659 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.029) 0:00:11.689 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.061) 0:00:11.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.059) 0:00:11.810 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:41:06 +0000 (0:00:00.030) 0:00:11.841 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.038) 0:00:11.879 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.035) 0:00:11.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.031) 0:00:11.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.030) 0:00:11.977 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.031) 0:00:12.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.028) 0:00:12.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.041) 0:00:12.078 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:41:07 +0000 (0:00:00.027) 0:00:12.105 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", 
"device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "mdadm", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:41:15 +0000 (0:00:08.580) 0:00:20.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:41:15 +0000 (0:00:00.031) 0:00:20.717 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:41:15 +0000 (0:00:00.028) 0:00:20.746 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", 
"device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "mdadm", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:41:15 +0000 (0:00:00.050) 0:00:20.796 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, 
"changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:41:15 +0000 (0:00:00.043) 0:00:20.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:41:15 +0000 (0:00:00.035) 0:00:20.875 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:41:16 +0000 (0:00:00.028) 0:00:20.904 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:41:16 +0000 (0:00:00.933) 0:00:21.837 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": 
"/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:41:18 +0000 (0:00:01.296) 0:00:23.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:41:18 +0000 (0:00:00.644) 0:00:23.778 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:41:19 +0000 (0:00:00.364) 0:00:24.142 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:41:19 +0000 (0:00:00.028) 0:00:24.171 ******** ok: [/cache/rhel-x.qcow2] 
TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:52 Wednesday 01 June 2022 16:41:20 +0000 (0:00:00.834) 0:00:25.005 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:41:20 +0000 (0:00:00.057) 0:00:25.063 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", 
"_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:41:20 +0000 (0:00:00.045) 0:00:25.109 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:41:20 +0000 (0:00:00.032) 0:00:25.141 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "20G", "type": "raid0", "uuid": "XJJdQO-Loff-INOl-g3M4-LQ1H-h2O7-eCLY9n" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "b671e3e1-8670-e0d5-2eff-d67c8fe44de9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "b671e3e1-8670-e0d5-2eff-d67c8fe44de9" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { 
"fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:41:20 +0000 (0:00:00.501) 0:00:25.643 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002850", "end": "2022-06-01 12:41:20.659762", "rc": 0, "start": "2022-06-01 12:41:20.656912" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 /dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.469) 0:00:26.112 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003149", "end": "2022-06-01 12:41:21.029229", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:41:21.026080" } STDERR: cat: /etc/crypttab: No such file or 
directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.373) 0:00:26.486 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.073) 0:00:26.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.034) 0:00:26.594 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.070) 0:00:26.665 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:41:21 +0000 (0:00:00.048) 0:00:26.713 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:41:22 
+0000 (0:00:00.465) 0:00:27.178 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.041) 0:00:27.220 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.036) 0:00:27.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.034) 0:00:27.292 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.037) 0:00:27.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.037) 0:00:27.367 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] 
*********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.053) 0:00:27.421 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:41:22 +0000 (0:00:00.062) 0:00:27.483 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.008051", "end": "2022-06-01 12:41:22.425309", "rc": 0, "start": "2022-06-01 12:41:22.417258" } STDOUT: /dev/md/vg1-1: Version : 1.2 Creation Time : Wed Jun 1 12:41:08 2022 Raid Level : raid0 Array Size : 20951040 (19.98 GiB 21.45 GB) Raid Devices : 2 Total Devices : 2 Persistence : Superblock is persistent Update Time : Wed Jun 1 12:41:08 2022 State : clean Active Devices : 2 Working Devices : 2 Failed Devices : 0 Spare Devices : 0 Layout : -unknown- Chunk Size : 512K Consistency Policy : none Name : vg1-1 UUID : b671e3e1:8670e0d5:2effd67c:8fe44de9 Events : 0 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.399) 0:00:27.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.042) 0:00:27.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.042) 0:00:27.969 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ None\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.042) 0:00:28.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.036) 0:00:28.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.036) 0:00:28.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.039) 0:00:28.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.034) 0:00:28.159 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK 
[Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.060) 0:00:28.219 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.117) 0:00:28.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.031) 0:00:28.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.029) 0:00:28.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.029) 0:00:28.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.029) 0:00:28.456 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.029) 0:00:28.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.030) 0:00:28.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.029) 0:00:28.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.031) 0:00:28.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.031) 0:00:28.608 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.068) 0:00:28.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.038) 0:00:28.715 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.037) 0:00:28.752 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.059) 0:00:28.812 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:41:23 +0000 (0:00:00.039) 0:00:28.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.037) 0:00:28.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:28.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.030) 0:00:28.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:28.982 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.014 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.034) 0:00:29.049 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.073) 0:00:29.122 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get 
information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.104) 0:00:29.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.034) 0:00:29.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.033) 0:00:29.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.033) 0:00:29.361 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.038) 0:00:29.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.464 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.497 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.030) 0:00:29.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.030) 0:00:29.623 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.655 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.687 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.719 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.035) 0:00:29.754 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.033) 0:00:29.788 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.032) 0:00:29.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:41:24 +0000 (0:00:00.031) 0:00:29.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.031) 0:00:29.883 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.031) 0:00:29.914 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.033) 0:00:29.948 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.030) 0:00:29.978 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.031) 0:00:30.009 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.032) 0:00:30.041 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.087) 0:00:30.129 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.041) 0:00:30.170 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.178) 0:00:30.349 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.038) 0:00:30.387 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.044) 0:00:30.431 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.039) 0:00:30.471 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.035) 0:00:30.506 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.039) 0:00:30.546 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.031) 0:00:30.577 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.030) 0:00:30.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.031) 0:00:30.640 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.030) 0:00:30.670 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.063) 0:00:30.733 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.066) 0:00:30.800 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:41:25 +0000 (0:00:00.073) 0:00:30.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.066) 0:00:30.939 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null,
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.058) 0:00:30.998 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.068) 0:00:31.067 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.065) 0:00:31.132 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101675.1461215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101675.1461215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5852, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101675.1461215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.406) 0:00:31.538 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.040) 0:00:31.579 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.038) 0:00:31.617 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.034) 0:00:31.651 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.031) 0:00:31.682 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.035) 0:00:31.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.031) 0:00:31.750 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.031) 0:00:31.781 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.033) 0:00:31.814 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:41:26 +0000 (0:00:00.045) 0:00:31.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.033) 0:00:31.893 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.032) 0:00:31.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:31.957 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:31.989 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.033) 0:00:32.023 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.039) 0:00:32.062 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.036) 0:00:32.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.130 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.033) 0:00:32.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.257 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.032) 0:00:32.290 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.322 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.031) 0:00:32.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.032) 0:00:32.385 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.032) 0:00:32.418 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:41:27 +0000 (0:00:00.038) 0:00:32.457 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.554) 0:00:33.012 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.374) 0:00:33.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.039) 0:00:33.425 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.033) 0:00:33.459 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.032) 0:00:33.491 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.033) 0:00:33.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.032) 0:00:33.557 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.031) 0:00:33.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.031) 0:00:33.619 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.035) 0:00:33.655 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.034) 0:00:33.689 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:41:28 +0000 (0:00:00.040) 0:00:33.730 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.035946", "end": "2022-06-01 12:41:28.704621", "rc": 0, "start": "2022-06-01 12:41:28.668675" }
STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.427) 0:00:34.158 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }
TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.040) 0:00:34.198 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.039) 0:00:34.238 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.033) 0:00:34.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.033) 0:00:34.305 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.035) 0:00:34.341 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.032) 0:00:34.373 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.031) 0:00:34.405 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.035) 0:00:34.441 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.133) 0:00:34.574 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.036) 0:00:34.611 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.045) 0:00:34.656 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.039) 0:00:34.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.038) 0:00:34.734 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.037) 0:00:34.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.031) 0:00:34.804 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.032) 0:00:34.836 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:41:29 +0000 (0:00:00.032) 0:00:34.869 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.033) 0:00:34.902 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.047) 0:00:34.950 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.035) 0:00:34.986 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.037) 0:00:35.023 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.032) 0:00:35.055 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.030) 0:00:35.086 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.037) 0:00:35.124 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.037) 0:00:35.161 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101674.8931215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101674.8931215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk",
"inode": 5818, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101674.8931215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.378) 0:00:35.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.038) 0:00:35.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.038) 0:00:35.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.036) 0:00:35.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.028) 0:00:35.682 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.033) 0:00:35.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.029) 0:00:35.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.029) 0:00:35.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.030) 0:00:35.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:41:30 +0000 (0:00:00.039) 0:00:35.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.064) 0:00:35.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.031) 0:00:35.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.032) 0:00:35.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.030) 0:00:36.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.028) 0:00:36.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.036) 0:00:36.068 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.036) 0:00:36.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.029) 0:00:36.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.030) 0:00:36.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.029) 0:00:36.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.029) 0:00:36.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.029) 0:00:36.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.032) 0:00:36.285 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.032) 0:00:36.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.032) 0:00:36.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.032) 0:00:36.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.031) 0:00:36.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.031) 0:00:36.445 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:41:31 +0000 (0:00:00.377) 0:00:36.822 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, 
"lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.366) 0:00:37.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.039) 0:00:37.228 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.034) 0:00:37.263 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.032) 0:00:37.295 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.035) 0:00:37.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.034) 0:00:37.365 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.033) 0:00:37.398 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.043) 0:00:37.442 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.038) 0:00:37.480 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.036) 0:00:37.517 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:41:32 +0000 (0:00:00.042) 0:00:37.559 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.042648", "end": "2022-06-01 12:41:32.537818", "rc": 0, "start": "2022-06-01 12:41:32.495170" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.438) 0:00:37.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.040) 0:00:38.039 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.042) 0:00:38.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.034) 0:00:38.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.033) 0:00:38.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.034) 0:00:38.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.032) 0:00:38.217 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.032) 0:00:38.250 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.040) 0:00:38.290 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.122) 0:00:38.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.036) 0:00:38.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 
4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.042) 0:00:38.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.038) 0:00:38.530 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.035) 0:00:38.566 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.084) 0:00:38.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.032) 0:00:38.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.040) 0:00:38.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.035) 0:00:38.758 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.035) 0:00:38.794 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:41:33 +0000 (0:00:00.055) 0:00:38.849 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.037) 0:00:38.886 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.039) 0:00:38.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.032) 0:00:38.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.032) 0:00:38.991 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.039) 0:00:39.031 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.040) 0:00:39.072 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101674.6421216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101674.6421216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5783, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101674.6421216, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.384) 0:00:39.456 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.041) 0:00:39.497 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 
16:41:34 +0000 (0:00:00.039) 0:00:39.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.036) 0:00:39.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.032) 0:00:39.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.037) 0:00:39.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.035) 0:00:39.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.031) 0:00:39.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.031) 0:00:39.741 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.038) 0:00:39.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.033) 0:00:39.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:41:34 +0000 (0:00:00.032) 0:00:39.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.035) 0:00:39.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:39.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:39.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.042) 0:00:39.991 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.035) 0:00:40.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:40.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.035) 0:00:40.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.034) 0:00:40.129 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.034) 0:00:40.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.034) 0:00:40.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:40.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.034) 0:00:40.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.036) 0:00:40.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:40.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
16:41:35 +0000 (0:00:00.032) 0:00:40.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.033) 0:00:40.402 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:41:35 +0000 (0:00:00.382) 0:00:40.784 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.359) 0:00:41.143 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.039) 0:00:41.183 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.035) 0:00:41.218 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.032) 0:00:41.251 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.035) 0:00:41.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.066) 0:00:41.353 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.033) 0:00:41.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.033) 0:00:41.420 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.039) 0:00:41.459 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.043) 0:00:41.503 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:41:36 +0000 (0:00:00.090) 0:00:41.593 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.036456", "end": "2022-06-01 12:41:36.563571", "rc": 0, "start": "2022-06-01 12:41:36.527115" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.426) 0:00:42.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.040) 0:00:42.061 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.039) 0:00:42.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.033) 0:00:42.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 
June 2022 16:41:37 +0000 (0:00:00.033) 0:00:42.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.034) 0:00:42.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.033) 0:00:42.236 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.032) 0:00:42.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.031) 0:00:42.300 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.028) 0:00:42.328 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:54 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.030) 0:00:42.359 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.062) 0:00:42.421 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:41:37 +0000 (0:00:00.044) 0:00:42.466 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.534) 0:00:43.001 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.070) 0:00:43.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.029) 0:00:43.102 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.029) 0:00:43.131 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.066) 0:00:43.198 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.027) 0:00:43.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.029) 0:00:43.255 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "present", "type": "lvm", "volumes": [ { 
"mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.038) 0:00:43.294 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.033) 0:00:43.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.029) 0:00:43.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.029) 0:00:43.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.030) 0:00:43.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.033) 0:00:43.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.051) 0:00:43.501 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:41:38 +0000 (0:00:00.030) 0:00:43.532 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": 
"/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:41:40 +0000 (0:00:02.024) 0:00:45.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.073) 0:00:45.630 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.031) 0:00:45.661 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], 
"mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.044) 0:00:45.705 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.040) 0:00:45.746 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.036) 0:00:45.782 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:41:40 +0000 (0:00:00.033) 0:00:45.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:41:41 +0000 (0:00:00.670) 0:00:46.487 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, 
"name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:41:42 +0000 (0:00:01.063) 0:00:47.551 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:41:43 +0000 (0:00:00.652) 0:00:48.204 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:41:43 +0000 (0:00:00.358) 0:00:48.562 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:41:43 +0000 (0:00:00.030) 0:00:48.593 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:75 Wednesday 01 June 2022 16:41:44 +0000 (0:00:00.863) 0:00:49.456 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:41:44 +0000 (0:00:00.059) 0:00:49.516 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:41:44 +0000 (0:00:00.055) 0:00:49.571 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:41:44 +0000 (0:00:00.030) 0:00:49.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "20G", "type": "raid0", "uuid": "XJJdQO-Loff-INOl-g3M4-LQ1H-h2O7-eCLY9n" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "b671e3e1-8670-e0d5-2eff-d67c8fe44de9" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "b671e3e1-8670-e0d5-2eff-d67c8fe44de9" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:41:45 +0000 (0:00:00.399) 0:00:50.001 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003312", "end": "2022-06-01 12:41:44.946690", "rc": 0, "start": "2022-06-01 12:41:44.943378" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 /dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:41:45 +0000 (0:00:00.401) 0:00:50.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002577", "end": "2022-06-01 12:41:45.324388", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:41:45.321811" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed 
in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:41:45 +0000 (0:00:00.377) 0:00:50.781 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:41:45 +0000 (0:00:00.076) 0:00:50.857 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.072) 0:00:50.930 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.066) 0:00:50.997 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.041) 0:00:51.038 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.389) 0:00:51.427 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.044) 0:00:51.472 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.039) 0:00:51.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.042) 0:00:51.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.038) 0:00:51.592 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.039) 0:00:51.632 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.046) 0:00:51.678 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:41:46 +0000 (0:00:00.059) 0:00:51.737 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.007752", "end": "2022-06-01 12:41:46.685324", "rc": 0, "start": "2022-06-01 12:41:46.677572" } STDOUT: /dev/md/vg1-1: Version : 1.2 Creation Time : Wed Jun 1 12:41:08 2022 Raid Level : raid0 Array Size : 20951040 (19.98 GiB 21.45 GB) Raid Devices : 2 Total Devices : 2 Persistence : Superblock is persistent Update Time : Wed Jun 1 12:41:08 2022 State : clean Active Devices : 2 Working Devices : 2 Failed Devices : 0 Spare Devices : 0 Layout : -unknown- Chunk Size : 512K Consistency Policy : none Name : vg1-1 UUID : b671e3e1:8670e0d5:2effd67c:8fe44de9 Events : 0 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.403) 0:00:52.141 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.041) 0:00:52.183 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 0\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.039) 0:00:52.222 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.2\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.040) 0:00:52.262 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.041) 0:00:52.304 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.041) 0:00:52.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.044) 0:00:52.390 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.031) 0:00:52.422 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.071) 0:00:52.493 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.094) 0:00:52.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.034) 0:00:52.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.033) 0:00:52.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.032) 0:00:52.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.033) 0:00:52.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.037) 0:00:52.758 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.037) 0:00:52.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.035) 0:00:52.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:41:47 +0000 (0:00:00.032) 0:00:52.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.031) 0:00:52.896 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 
Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.068) 0:00:52.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.039) 0:00:53.004 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.037) 0:00:53.042 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.062) 0:00:53.104 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.037) 0:00:53.141 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.037) 0:00:53.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of 
crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.031) 0:00:53.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.031) 0:00:53.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.031) 0:00:53.274 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.034) 0:00:53.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.084) 0:00:53.393 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.064) 0:00:53.458 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.099) 0:00:53.558 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.033) 0:00:53.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.030) 0:00:53.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.030) 0:00:53.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.034) 0:00:53.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.031) 0:00:53.720 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.029) 0:00:53.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.029) 0:00:53.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.030) 0:00:53.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.029) 0:00:53.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:41:48 +0000 (0:00:00.034) 0:00:53.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.033) 0:00:53.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:53.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:53.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:54.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.033) 0:00:54.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.036) 0:00:54.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:54.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.030) 0:00:54.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:54.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:54.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:54.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.035) 0:00:54.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:54.299 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:54.332 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:54.365 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.083) 0:00:54.448 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.036) 0:00:54.485 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for
/cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.121) 0:00:54.607 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.036) 0:00:54.643 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "31f9f766-4693-48e3-b76b-c1ae94232bb1" } ],
"storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.044) 0:00:54.688 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.039) 0:00:54.727 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.038) 0:00:54.765 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.039) 0:00:54.805 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.031) 0:00:54.837 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:41:49 +0000 (0:00:00.032) 0:00:54.869 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts]
*************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.034) 0:00:54.903 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.034) 0:00:54.938 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.048) 0:00:54.987 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.036) 0:00:55.023 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.037) 0:00:55.061 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.031) 0:00:55.092 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.034) 0:00:55.126 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.037) 0:00:55.164 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.038) 0:00:55.202 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101675.1461215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101675.1461215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5852, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode":
"0660", "mtime": 1654101675.1461215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.386) 0:00:55.589 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.038) 0:00:55.628 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.039) 0:00:55.667 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.034) 0:00:55.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.033) 0:00:55.735 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.037) 0:00:55.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.031) 0:00:55.804 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.032) 0:00:55.836 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:41:50 +0000 (0:00:00.030) 0:00:55.867 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.036) 0:00:55.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.032) 0:00:55.936 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.029)
0:00:55.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.029) 0:00:55.994 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.031) 0:00:56.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.030) 0:00:56.057 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.038) 0:00:56.096 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.037) 0:00:56.134 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.032) 0:00:56.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.031) 0:00:56.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.030) 0:00:56.228 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.032) 0:00:56.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.030) 0:00:56.291 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.033) 0:00:56.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.032) 0:00:56.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.031) 0:00:56.388 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.032) 0:00:56.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.031) 0:00:56.451 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.033) 0:00:56.485 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:41:51 +0000 (0:00:00.365) 0:00:56.850 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.396) 0:00:57.247 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.039) 0:00:57.287 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.035) 0:00:57.322 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.033) 0:00:57.355 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.034) 0:00:57.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.039) 0:00:57.429 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.033) 0:00:57.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug]
*******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.032) 0:00:57.495 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.040) 0:00:57.536 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.036) 0:00:57.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:41:52 +0000 (0:00:00.042) 0:00:57.615 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.038015", "end": "2022-06-01 12:41:52.582814", "rc": 0, "start": "2022-06-01 12:41:52.544799" }
STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.431) 0:00:58.046 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type]
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.048) 0:00:58.095 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.043) 0:00:58.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.033) 0:00:58.173 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.032) 0:00:58.205 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.033) 0:00:58.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.033) 0:00:58.272 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.031) 0:00:58.304 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.040) 0:00:58.344 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.172) 0:00:58.517 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.037) 0:00:58.554 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device":
"/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "2219ffac-5e06-411e-8fb5-1afc9a85e0f4" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.043) 0:00:58.598 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.037) 0:00:58.636 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.035) 0:00:58.671 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022
16:41:53 +0000 (0:00:00.040) 0:00:58.712 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.031) 0:00:58.743 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.031) 0:00:58.775 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.030) 0:00:58.805 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:41:53 +0000 (0:00:00.032) 0:00:58.838 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the
device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.046) 0:00:58.884 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.038) 0:00:58.923 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.038) 0:00:58.962 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.032) 0:00:58.994 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.031) 0:00:59.026 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.039) 0:00:59.065 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.040) 0:00:59.106 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101674.8931215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101674.8931215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5818, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101674.8931215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.376) 0:00:59.482 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.037) 0:00:59.520 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.037) 0:00:59.557 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" 
}, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.034) 0:00:59.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.030) 0:00:59.622 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.036) 0:00:59.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.031) 0:00:59.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.033) 0:00:59.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.031) 0:00:59.755 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] 
**************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.038) 0:00:59.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.030) 0:00:59.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:41:54 +0000 (0:00:00.031) 0:00:59.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.035) 0:00:59.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.038) 0:00:59.929 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.032) 0:00:59.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": 
false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.041) 0:01:00.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.036) 0:01:00.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.031) 0:01:00.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.034) 0:01:00.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.035) 0:01:00.141 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.034) 0:01:00.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.033) 0:01:00.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.033) 0:01:00.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.032) 0:01:00.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.034) 0:01:00.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.033) 0:01:00.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.034) 0:01:00.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of 
the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.032) 0:01:00.411 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:41:55 +0000 (0:00:00.387) 0:01:00.798 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.371) 0:01:01.169 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.040) 0:01:01.210 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.036) 0:01:01.247 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.032) 0:01:01.280 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.031) 0:01:01.311 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.034) 0:01:01.345 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.032) 0:01:01.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.031) 0:01:01.409 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.037) 0:01:01.446 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.033) 0:01:01.480 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:41:56 +0000 (0:00:00.045) 0:01:01.526 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", 
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.031201", "end": "2022-06-01 12:41:56.470488", "rc": 0, "start": "2022-06-01 12:41:56.439287" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.400) 0:01:01.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.040) 0:01:01.968 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.041) 0:01:02.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.033) 0:01:02.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.032) 0:01:02.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.032) 0:01:02.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.034) 0:01:02.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.031) 0:01:02.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.036) 0:01:02.210 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.118) 0:01:02.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.036) 0:01:02.365 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "fa7ef1a8-d4d3-42dd-bb22-536ff9010f7f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.042) 0:01:02.408 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] 
*************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.039) 0:01:02.448 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.037) 0:01:02.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.039) 0:01:02.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.031) 0:01:02.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.031) 0:01:02.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.030) 0:01:02.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, 
"storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.034) 0:01:02.653 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.048) 0:01:02.701 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.036) 0:01:02.738 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.045) 0:01:02.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.031) 0:01:02.815 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:41:57 +0000 (0:00:00.036) 0:01:02.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.039) 0:01:02.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.036) 0:01:02.927 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101674.6421216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101674.6421216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5783, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101674.6421216, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 
2022 16:41:58 +0000 (0:00:00.381) 0:01:03.308 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.037) 0:01:03.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.044) 0:01:03.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.039) 0:01:03.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.036) 0:01:03.467 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.038) 0:01:03.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.031) 0:01:03.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.031) 0:01:03.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.031) 0:01:03.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.038) 0:01:03.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.034) 0:01:03.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.087) 0:01:03.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.033) 0:01:03.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.033) 0:01:03.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:41:58 +0000 (0:00:00.034) 0:01:03.862 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.041) 0:01:03.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.039) 0:01:03.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.032) 0:01:03.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.031) 0:01:04.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.030) 0:01:04.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.032) 0:01:04.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.032) 0:01:04.102 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.034) 0:01:04.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.031) 0:01:04.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.032) 0:01:04.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.031) 0:01:04.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.031) 0:01:04.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.032) 0:01:04.296 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:41:59 +0000 (0:00:00.361) 0:01:04.657 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.384) 0:01:05.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.039) 0:01:05.081 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } 
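The size-verification tasks above parse both the actual and the requested volume size into bytes (here "3g" / "3GiB" → 3221225472) before asserting they match. A minimal sketch of that conversion, assuming binary (GiB) units as the parsed output above indicates; `parse_size` is a hypothetical helper, not part of the role:

```python
# Hypothetical helper mirroring what the size-parse tasks compute;
# assumes binary units (1g == 1 GiB == 1024**3 bytes), matching the
# "bytes": 3221225472 result shown for "3g" in the log above.
_UNITS = {"k": 1, "m": 2, "g": 3, "t": 4}

def parse_size(size):
    """Convert an LVM/parted-style size like '3g' or '3GiB' to bytes."""
    s = size.strip().lower().removesuffix("ib").removesuffix("b")
    if s and s[-1] in _UNITS:
        return int(float(s[:-1]) * 1024 ** _UNITS[s[-1]])
    return int(s)  # already a plain byte count
```

With this, both spellings seen in the log resolve to the same byte count, which is what the `assert` task compares.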
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.036) 0:01:05.118 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.031) 0:01:05.149 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.032) 0:01:05.182 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.032) 0:01:05.214 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.031) 0:01:05.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.031) 0:01:05.277 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.038) 
0:01:05.315 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.033) 0:01:05.349 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.044) 0:01:05.393 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.032840", "end": "2022-06-01 12:42:00.349615", "rc": 0, "start": "2022-06-01 12:42:00.316775" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.411) 0:01:05.804 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:42:00 +0000 (0:00:00.040) 0:01:05.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.041) 0:01:05.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
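The `lvs --noheadings --nameprefixes --unquoted ... vg1/lv3` call above emits shell-style `LVM2_KEY=value` pairs on stdout, which the following `set_fact` turns into the `storage_test_lv_segtype` fact checked by "check segment type". A sketch of parsing that output (illustrative only, not the role's actual code):

```python
def parse_lvs_nameprefixes(line):
    """Split one line of `lvs --nameprefixes --noheadings --unquoted`
    output into a dict of LVM2_* keys; empty values stay empty strings."""
    return dict(field.split("=", 1) for field in line.split())

# The exact STDOUT captured in the log above:
stdout = ("LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- "
          "LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear")
fields = parse_lvs_nameprefixes(stdout)
# fields["LVM2_SEGTYPE"] == "linear" confirms the LV is uncached,
# so the subsequent cache-size tasks are skipped.
```

Note that `--unquoted` is what makes whitespace-splitting safe here; with quoted output the parser would need to handle embedded spaces.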
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.037) 0:01:05.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.035) 0:01:05.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.034) 0:01:05.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.034) 0:01:06.029 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.032) 0:01:06.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.030) 0:01:06.091 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:42:01 +0000 
(0:00:00.030) 0:01:06.122 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the device created above] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:77 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.029) 0:01:06.152 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.076) 0:01:06.228 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.046) 0:01:06.275 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:42:01 +0000 (0:00:00.542) 0:01:06.817 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.075) 0:01:06.893 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.032) 0:01:06.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.033) 0:01:06.959 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.062) 0:01:07.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.025) 0:01:07.047 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.031) 0:01:07.079 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.045) 0:01:07.124 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.041) 0:01:07.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.034) 0:01:07.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.033) 0:01:07.234 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.032) 0:01:07.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.035) 0:01:07.302 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.047) 0:01:07.350 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:42:02 +0000 (0:00:00.030) 0:01:07.381 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": 
"lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:42:06 +0000 (0:00:03.841) 0:01:11.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:42:06 +0000 (0:00:00.036) 0:01:11.259 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:42:06 +0000 (0:00:00.030) 0:01:11.289 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": 
"destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", 
"vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
16:42:06 +0000 (0:00:00.051) 0:01:11.341 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:42:06 +0000 (0:00:00.045) 0:01:11.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:42:06 +0000 (0:00:00.037) 0:01:11.424 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:42:07 +0000 (0:00:01.107) 0:01:12.531 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:42:08 +0000 (0:00:00.668) 0:01:13.199 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:42:08 +0000 (0:00:00.032) 0:01:13.231 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} 
} TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:42:09 +0000 (0:00:00.675) 0:01:13.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:42:09 +0000 (0:00:00.398) 0:01:14.306 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:42:09 +0000 (0:00:00.033) 0:01:14.339 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:98 Wednesday 01 June 2022 16:42:10 +0000 (0:00:00.809) 0:01:15.149 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:42:10 +0000 (0:00:00.061) 0:01:15.211 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:42:10 +0000 (0:00:00.045) 0:01:15.256 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:42:10 +0000 (0:00:00.029) 0:01:15.286 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": 
"root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:42:10 +0000 (0:00:00.390) 0:01:15.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002826", "end": "2022-06-01 12:42:10.603752", "rc": 0, "start": "2022-06-01 12:42:10.600926" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.382) 0:01:16.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002621", "end": "2022-06-01 12:42:10.975384", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:42:10.972763" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.370) 0:01:16.430 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.074) 0:01:16.504 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.030) 0:01:16.535 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.064) 0:01:16.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.041) 0:01:16.641 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.028) 0:01:16.670 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.028) 0:01:16.699 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.037) 0:01:16.736 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.038) 0:01:16.774 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.036) 0:01:16.811 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:42:11 +0000 (0:00:00.038) 0:01:16.849 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:16.878 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.057) 0:01:16.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.031) 0:01:16.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.030) 0:01:16.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.030) 0:01:17.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.031) 0:01:17.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.148 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.178 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.055) 0:01:17.234 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.128) 0:01:17.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.032) 0:01:17.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.035) 0:01:17.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information 
about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.040) 0:01:17.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.038) 0:01:17.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.038) 0:01:17.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.031) 0:01:17.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.609 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.031) 0:01:17.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] 
************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.029) 0:01:17.670 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.062) 0:01:17.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.043) 0:01:17.776 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.028) 0:01:17.805 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.026) 0:01:17.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:42:12 +0000 (0:00:00.030) 0:01:17.862 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:42:13 +0000 
(0:00:00.062) 0:01:17.925 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.099) 0:01:18.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.032) 0:01:18.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.030) 0:01:18.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.031) 0:01:18.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.028) 0:01:18.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.030) 0:01:18.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 
2022 16:42:13 +0000 (0:00:00.029) 0:01:18.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.031) 0:01:18.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.031) 0:01:18.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.031) 0:01:18.536 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.032) 0:01:18.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.030) 0:01:18.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.028) 0:01:18.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.028) 0:01:18.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.028) 0:01:18.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.030) 0:01:18.716 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.031) 0:01:18.748 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.029) 0:01:18.777 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:42:13 +0000 (0:00:00.080) 0:01:18.858 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.035) 0:01:18.894 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.121) 0:01:19.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.035) 0:01:19.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.040) 0:01:19.091 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.029) 0:01:19.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.035) 0:01:19.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.029) 0:01:19.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.031) 0:01:19.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.080) 0:01:19.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.031) 0:01:19.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.031) 0:01:19.360 
********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.045) 0:01:19.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.034) 0:01:19.465 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.030) 0:01:19.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.029) 0:01:19.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.029) 0:01:19.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:42:14 +0000 (0:00:00.025) 0:01:19.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.359) 0:01:19.940 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.038) 0:01:19.978 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.024) 0:01:20.003 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.035) 0:01:20.039 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.030) 0:01:20.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.026) 0:01:20.095 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.037) 0:01:20.133 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.032) 0:01:20.165 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.029) 0:01:20.194 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.026) 0:01:20.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.029) 0:01:20.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.029) 0:01:20.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.028) 0:01:20.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.027) 0:01:20.336 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.027) 0:01:20.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday
01 June 2022 16:42:15 +0000 (0:00:00.036) 0:01:20.401 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.036) 0:01:20.437 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.031) 0:01:20.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.031) 0:01:20.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.030) 0:01:20.531 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.031) 0:01:20.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.032) 0:01:20.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.029) 0:01:20.625 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.029) 0:01:20.655 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.030) 0:01:20.685 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.030) 0:01:20.716 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.031) 0:01:20.747 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.035) 0:01:20.783 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.031) 0:01:20.814 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.028) 0:01:20.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:42:15 +0000 (0:00:00.028) 0:01:20.871 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:20.901 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.031) 0:01:20.933 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.042) 0:01:20.976 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.007 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.028) 0:01:21.036 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.028) 0:01:21.064 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.032) 0:01:21.097 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.034) 0:01:21.131 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.032) 0:01:21.164 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.195 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.225 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.256 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.029) 0:01:21.286 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.316 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.034) 0:01:21.351 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.030) 0:01:21.381 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.031) 0:01:21.413 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.038) 0:01:21.451 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.167) 0:01:21.618 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.036) 0:01:21.655 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.041) 0:01:21.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.031) 0:01:21.728 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.036) 0:01:21.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.028) 0:01:21.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.028) 0:01:21.822 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:42:16 +0000 (0:00:00.028) 0:01:21.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.029) 0:01:21.880 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.032) 0:01:21.912 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.043) 0:01:21.955 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.024) 0:01:21.980 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options]
****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.032) 0:01:22.012 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.025) 0:01:22.038 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.028) 0:01:22.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.029) 0:01:22.096 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.022) 0:01:22.118 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.357) 0:01:22.476 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.037) 0:01:22.513 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.026) 0:01:22.540 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.033) 0:01:22.573 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.030) 0:01:22.604 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.026) 0:01:22.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.031) 0:01:22.662 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.029) 0:01:22.691 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.030) 0:01:22.722 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.025) 0:01:22.747 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.036) 0:01:22.784 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.028) 0:01:22.812 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:42:17 +0000 (0:00:00.033) 0:01:22.845 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:22.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:22.906 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.036) 0:01:22.943 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.036) 0:01:22.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.010 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.041 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.071 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.102 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.191 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.032) 0:01:23.224 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.034) 0:01:23.318 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.349 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.032) 0:01:23.413 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.033) 0:01:23.446 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.477 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.539 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.570 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.032) 0:01:23.603 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.033) 0:01:23.637 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.030) 0:01:23.728 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.029) 0:01:23.757 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.066) 0:01:23.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:42:18 +0000 (0:00:00.031) 0:01:23.855 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.030) 0:01:23.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:23.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.030) 0:01:23.945 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:23.974 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.037) 0:01:24.011 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.114) 0:01:24.126 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.034) 0:01:24.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.038) 0:01:24.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:24.228 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.033) 0:01:24.262 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.028) 0:01:24.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.031) 0:01:24.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.028) 0:01:24.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:24.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.030) 0:01:24.411 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", 
"storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.045) 0:01:24.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.024) 0:01:24.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.036) 0:01:24.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:24.547 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.030) 0:01:24.577 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.029) 0:01:24.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:42:19 +0000 (0:00:00.024) 0:01:24.631 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.373) 0:01:25.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.037) 0:01:25.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.026) 0:01:25.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.033) 0:01:25.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.130 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.024) 0:01:25.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.027) 0:01:25.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] 
*********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.029) 0:01:25.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.028) 0:01:25.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.029) 0:01:25.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.036) 0:01:25.420 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.047) 0:01:25.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.036) 0:01:25.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.032) 0:01:25.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.031) 0:01:25.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.030) 0:01:25.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.031) 0:01:25.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.032) 0:01:25.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.030) 0:01:25.695 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.030) 0:01:25.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.030) 0:01:25.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.029) 0:01:25.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.029) 0:01:25.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:42:20 +0000 (0:00:00.032) 0:01:25.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the 
requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:25.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:25.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.031) 0:01:25.940 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.033) 0:01:25.974 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.004 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.076) 0:01:26.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.111 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.029) 0:01:26.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.031) 0:01:26.171 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.032) 0:01:26.204 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.031) 0:01:26.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.031) 0:01:26.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.029) 0:01:26.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.029) 0:01:26.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.387 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.033) 0:01:26.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.032) 0:01:26.483 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.031) 0:01:26.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.545 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.029) 0:01:26.575 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create a RAID1 lvm raid device] ****************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:100 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.030) 0:01:26.605 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.085) 0:01:26.690 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:42:21 +0000 (0:00:00.042) 0:01:26.733 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.522) 0:01:27.255 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.071) 0:01:27.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.030) 0:01:27.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.028) 0:01:27.386 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.071) 0:01:27.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.026) 0:01:27.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.030) 0:01:27.514 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.036) 0:01:27.551 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.035) 0:01:27.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.030) 0:01:27.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.029) 0:01:27.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.029) 0:01:27.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.029) 0:01:27.705 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.043) 0:01:27.749 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:42:22 +0000 (0:00:00.036) 0:01:27.785 ********
changed: [/cache/rhel-x.qcow2] => {
    "actions": [
        { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" },
        { "action": "create device", "device": "/dev/sdb1",
"fs_type": null },
        { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" },
        { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" },
        { "action": "create device", "device": "/dev/sda1", "fs_type": null },
        { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" },
        { "action": "create device", "device": "/dev/vg1", "fs_type": null },
        { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null },
        { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ],
    "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ],
    "packages": [ "lvm2", "dosfstools", "xfsprogs" ],
    "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0,
"mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:42:25 +0000 (0:00:02.913) 0:01:30.699 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:42:25 +0000 (0:00:00.031) 0:01:30.732 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:42:25 +0000 (0:00:00.031) 0:01:30.763 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2",
"/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ],
        "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ],
        "packages": [ "lvm2", "dosfstools", "xfsprogs" ],
        "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:42:25 +0000 (0:00:00.106) 0:01:30.870 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks":
[ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:42:26 +0000 (0:00:00.038) 0:01:30.908 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:42:26 +0000 (0:00:00.035) 0:01:30.944 ********

TASK [linux-system-roles.storage : tell systemd to
refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:42:26 +0000 (0:00:00.028) 0:01:30.973 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:42:26 +0000 (0:00:00.662) 0:01:31.635 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:42:27 +0000 (0:00:00.415) 0:01:32.050 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:42:27 +0000 (0:00:00.652) 0:01:32.702 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:42:28 +0000 (0:00:00.375) 0:01:33.078 ********

TASK
[linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:42:28 +0000 (0:00:00.028) 0:01:33.107 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:116
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.853) 0:01:33.961 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.067) 0:01:34.029 ********
ok: [/cache/rhel-x.qcow2] => {
    "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null,
"raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.030) 0:01:34.070 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.030) 0:01:34.101 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" },
        "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "2G", "type": "lvm", "uuid": "" },
        "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "2G", "type": "lvm", "uuid": "" },
        "/dev/mapper/vg1-lv1_rmeta_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_0", "size": "4M", "type": "lvm", "uuid": "" },
        "/dev/mapper/vg1-lv1_rmeta_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_1", "size": "4M", "type": "lvm", "uuid": "" },
        "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "WUkFBd-bvcu-E8OB-Ebn9-Tkig-9x80-61rxdM" },
        "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "QfRckV-3znD-cR2T-50Bb-2DJT-wzUj-VbbZBk" },
        "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
        "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
        "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
        "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
        "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
        "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.379) 0:01:34.480 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002593", "end": "2022-06-01 12:42:29.391725", "rc": 0, "start": "2022-06-01 12:42:29.389132" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:42:29 +0000 (0:00:00.368) 0:01:34.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002394", "end": "2022-06-01 12:42:29.765179", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:42:29.762785" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:42:30 +0000 (0:00:00.370) 0:01:35.219 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:42:30 +0000 (0:00:00.067) 0:01:35.287 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:42:30 +0000 (0:00:00.032) 0:01:35.319 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:42:30 +0000 (0:00:00.067) 0:01:35.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [],
"_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:42:30 +0000 (0:00:00.040) 0:01:35.427 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.741) 0:01:36.169 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.053) 0:01:36.222 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.039) 0:01:36.262 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.036) 0:01:36.299 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.040) 0:01:36.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.032) 0:01:36.371 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }

MSG:

All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.054) 0:01:36.426 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.057) 0:01:36.484 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.032) 0:01:36.517 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.032) 0:01:36.549 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.033) 0:01:36.583 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.033) 0:01:36.617 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.030) 0:01:36.648 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.029) 0:01:36.677 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.031) 0:01:36.709 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID]
**********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.032) 0:01:36.742 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.058) 0:01:36.800 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:42:31 +0000 (0:00:00.066) 0:01:36.867 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid1", "vg1" ], "delta": "0:00:00.032976", "end": "2022-06-01 12:42:31.811502", "rc": 0, "start": "2022-06-01 12:42:31.778526" }

STDOUT:

lv1

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.402) 0:01:37.269 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.043) 0:01:37.313 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.039) 0:01:37.352 ********
included:
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.070) 0:01:37.422 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.037) 0:01:37.460 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.042) 0:01:37.503 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.066) 0:01:37.570 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path:
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.038) 0:01:37.608 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.036) 0:01:37.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.030) 0:01:37.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.029) 0:01:37.706 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.031) 0:01:37.737 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.040) 0:01:37.778 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.043) 0:01:37.821 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:42:32 +0000 (0:00:00.038) 0:01:37.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.031) 0:01:37.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.031) 0:01:37.923 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.032) 0:01:37.955 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.031) 0:01:37.986 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01
June 2022 16:42:33 +0000 (0:00:00.037) 0:01:38.024 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.066) 0:01:38.090 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.111) 0:01:38.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.033) 0:01:38.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.036) 0:01:38.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.039) 0:01:38.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.033) 0:01:38.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.032) 0:01:38.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.035) 0:01:38.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.032) 0:01:38.444 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.032) 0:01:38.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.031) 0:01:38.508 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.069) 0:01:38.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.040) 0:01:38.618 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.121) 0:01:38.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.038) 0:01:38.778 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508813, "block_size": 4096, "block_total": 520704, "block_used": 11891, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2084098048, "size_total": 2132803584, "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508813, "block_size": 4096, "block_total": 520704, "block_used": 11891, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2084098048, "size_total": 2132803584, "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.046) 0:01:38.825 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:42:33 +0000 (0:00:00.043) 0:01:38.869 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.036) 0:01:38.905 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.037) 0:01:38.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.032) 0:01:38.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.031) 0:01:39.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.031) 0:01:39.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.034) 0:01:39.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.047) 0:01:39.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.035) 0:01:39.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.039) 0:01:39.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.033) 0:01:39.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.036) 0:01:39.264 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.038) 0:01:39.303 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.038) 0:01:39.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101745.1761215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101745.1761215, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 6292, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101745.1761215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.394) 0:01:39.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.040) 0:01:39.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.038) 0:01:39.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:42:34 +0000 (0:00:00.035) 0:01:39.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.034) 0:01:39.884 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.035) 0:01:39.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.032) 0:01:39.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.031) 0:01:39.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:42:35 +0000 (0:00:00.030) 0:01:40.014 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.040) 0:01:40.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.036) 0:01:40.091 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.032) 0:01:40.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.031) 0:01:40.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.033) 0:01:40.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.033) 0:01:40.222 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.039) 0:01:40.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.043) 0:01:40.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.032) 0:01:40.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.032) 0:01:40.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.039) 0:01:40.409 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.033) 0:01:40.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.033) 0:01:40.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.036) 0:01:40.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.097) 0:01:40.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.036) 0:01:40.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.035) 0:01:40.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.034) 0:01:40.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:42:35 +0000 (0:00:00.033) 0:01:40.750 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.380) 0:01:41.131 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.391) 0:01:41.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.039) 0:01:41.562 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.037) 0:01:41.599 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.033) 0:01:41.632 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.033) 0:01:41.666 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.033) 0:01:41.700 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.033) 0:01:41.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.032) 0:01:41.765 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.036) 0:01:41.802 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:42:36 +0000 (0:00:00.034) 0:01:41.836 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.042) 0:01:41.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.037733", "end": "2022-06-01 12:42:36.830707", "rc": 0, "start": "2022-06-01 12:42:36.792974" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.407) 0:01:42.286 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid1" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.041) 0:01:42.327 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.041) 0:01:42.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.035) 0:01:42.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.033) 0:01:42.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.034) 0:01:42.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.036) 0:01:42.509 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.037) 0:01:42.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.032) 0:01:42.578 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.032) 0:01:42.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:118 Wednesday 01 June 2022 16:42:37 
+0000 (0:00:00.032) 0:01:42.643 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.091) 0:01:42.735 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:42:37 +0000 (0:00:00.055) 0:01:42.791 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.520) 0:01:43.311 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.077) 0:01:43.389 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.033) 0:01:43.423 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.031) 0:01:43.454 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.064) 0:01:43.519 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.075) 0:01:43.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.032) 0:01:43.627 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.041) 0:01:43.668 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.034) 0:01:43.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.032) 0:01:43.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.037) 0:01:43.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.033) 0:01:43.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:42:38 +0000 (0:00:00.032) 0:01:43.840 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:42:39 +0000 (0:00:00.047) 0:01:43.887 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:42:39 +0000 (0:00:00.028) 0:01:43.916 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:42:41 +0000 (0:00:02.507) 0:01:46.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.032) 0:01:46.455 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.030) 0:01:46.486 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", 
"/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.044) 0:01:46.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.040) 0:01:46.570 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.036) 0:01:46.606 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:42:41 +0000 (0:00:00.031) 0:01:46.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:42:42 +0000 (0:00:00.689) 0:01:47.328 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:42:42 +0000 (0:00:00.409) 0:01:47.738 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:42:43 +0000 (0:00:00.685) 0:01:48.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:42:43 +0000 (0:00:00.379) 0:01:48.804 ******** TASK [linux-system-roles.storage : 
Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:42:43 +0000 (0:00:00.030) 0:01:48.834 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:134 Wednesday 01 June 2022 16:42:44 +0000 (0:00:00.891) 0:01:49.726 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:42:44 +0000 (0:00:00.070) 0:01:49.796 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", 
"sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:42:44 +0000 (0:00:00.043) 0:01:49.839 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:42:44 +0000 (0:00:00.033) 0:01:49.872 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_0", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_1", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "WUkFBd-bvcu-E8OB-Ebn9-Tkig-9x80-61rxdM" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "QfRckV-3znD-cR2T-50Bb-2DJT-wzUj-VbbZBk" }, "/dev/sdc": { "fstype": "", 
"label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:42:45 +0000 (0:00:00.399) 0:01:50.272 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002650", "end": "2022-06-01 12:42:45.174810", "rc": 0, "start": "2022-06-01 12:42:45.172160" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:42:45 +0000 (0:00:00.361) 0:01:50.633 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002337", "end": "2022-06-01 12:42:45.553506", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:42:45.551169" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:42:46 +0000 (0:00:00.375) 0:01:51.009 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:42:46 +0000 (0:00:00.063) 0:01:51.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:42:46 +0000 (0:00:00.032) 0:01:51.105 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:42:46 +0000 (0:00:00.081) 0:01:51.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:42:46 +0000 (0:00:00.049) 0:01:51.237 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.713) 0:01:51.950 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.051) 0:01:52.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.043) 0:01:52.046 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.037) 0:01:52.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.037) 0:01:52.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.032) 0:01:52.154 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.052) 0:01:52.207 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.060) 0:01:52.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.032) 0:01:52.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.030) 0:01:52.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.029) 0:01:52.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.029) 0:01:52.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.030) 0:01:52.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.028) 0:01:52.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.029) 0:01:52.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.030) 0:01:52.509 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.062) 0:01:52.571 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:42:47 +0000 (0:00:00.063) 0:01:52.634 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid1", "vg1" ], "delta": "0:00:00.034449", "end": "2022-06-01 12:42:47.610599", "rc": 0, "start": "2022-06-01 12:42:47.576150" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.432) 0:01:53.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.041) 0:01:53.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.041) 0:01:53.150 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.112) 0:01:53.263 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.036) 0:01:53.300 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.041) 0:01:53.341 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.070) 0:01:53.411 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.035) 0:01:53.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.036) 0:01:53.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.030) 0:01:53.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.032) 0:01:53.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.030) 0:01:53.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.031) 0:01:53.609 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.034) 0:01:53.644 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.035) 0:01:53.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.031) 0:01:53.712 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.034) 0:01:53.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.032) 0:01:53.779 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.032) 0:01:53.812 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:42:48 +0000 (0:00:00.030) 0:01:53.843 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.066) 0:01:53.909 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.066) 0:01:53.976 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.008 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.035) 0:01:54.044 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.038) 0:01:54.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.033) 0:01:54.116 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.213 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.033) 0:01:54.247 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.037) 0:01:54.285 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
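The warning above comes from an outer and an inner loop both using the default-named variable `storage_test_volume`. A minimal sketch of the fix the warning suggests, renaming the inner loop's variable via `loop_control`/`loop_var` (the task name and loop expression here are hypothetical, not taken from the test suite):

```yaml
# Hypothetical illustration of the fix suggested by the warning:
# give the inner include's loop its own variable name so it cannot
# collide with the outer loop's `storage_test_volume`.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: inner_storage_test_volume
```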
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.064) 0:01:54.349 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.037) 0:01:54.387 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.119) 0:01:54.506 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.037) 0:01:54.544 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509814, "block_size": 4096, "block_total": 521728, "block_used": 11914, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088198144, "size_total": 2136997888, "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509814, "block_size": 4096, "block_total": 521728, "block_used": 11914, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088198144, "size_total": 2136997888, "uuid": "725fa0d8-8d27-4d80-a121-d5458f04def1" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.044) 0:01:54.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.037) 0:01:54.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.035) 0:01:54.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.038) 0:01:54.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.034) 0:01:54.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.032) 0:01:54.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:42:49 +0000 (0:00:00.033) 0:01:54.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.049) 0:01:54.883 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.037) 0:01:54.921 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.038) 0:01:54.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.031) 0:01:54.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.032) 0:01:55.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.039) 0:01:55.063 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.038) 0:01:55.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101760.9101214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101760.9101214, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 6292, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101760.9101214, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.381) 0:01:55.483 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.038) 0:01:55.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.039) 0:01:55.562 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.037) 0:01:55.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.032) 0:01:55.632 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.039) 0:01:55.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.031) 0:01:55.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.032) 0:01:55.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:42:50 +0000 (0:00:00.031) 0:01:55.768 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.041) 0:01:55.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.032) 0:01:55.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:42:50 +0000 (0:00:00.031) 0:01:55.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.031) 0:01:55.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.032) 0:01:55.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.032) 0:01:55.969 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.041) 0:01:56.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.036) 0:01:56.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.030) 0:01:56.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.030) 0:01:56.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.030) 0:01:56.138 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.031) 0:01:56.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.032) 0:01:56.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.033) 0:01:56.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.031) 0:01:56.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.031) 0:01:56.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.032) 0:01:56.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.034) 0:01:56.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.038) 0:01:56.403 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:42:51 +0000 (0:00:00.393) 0:01:56.796 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.380) 0:01:57.177 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.040) 0:01:57.218 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.035) 0:01:57.254 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.032) 0:01:57.286 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.035) 0:01:57.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.035) 0:01:57.357 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.038) 0:01:57.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.041) 0:01:57.438 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.039) 0:01:57.477 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.037) 0:01:57.515 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:42:52 +0000 (0:00:00.042) 0:01:57.558 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.038623", "end": "2022-06-01 12:42:52.520771", "rc": 0, "start": "2022-06-01 12:42:52.482148" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.421) 0:01:57.979 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid1" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.039) 0:01:58.018 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.041) 0:01:58.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.033) 0:01:58.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.033) 0:01:58.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.033) 0:01:58.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.033) 0:01:58.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.034) 0:01:58.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.031) 0:01:58.261 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.029) 0:01:58.290 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the device created above] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:136 Wednesday 01 June 2022 16:42:53 
+0000 (0:00:00.030) 0:01:58.321 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.141) 0:01:58.463 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:42:53 +0000 (0:00:00.045) 0:01:58.508 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.528) 0:01:59.037 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.074) 0:01:59.111 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.035) 0:01:59.146 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.032) 0:01:59.179 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.071) 0:01:59.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.028) 0:01:59.279 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.036) 0:01:59.316 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.040) 0:01:59.356 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.047) 0:01:59.404 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.032) 0:01:59.436 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.031) 0:01:59.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.039) 0:01:59.508 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services]
************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.033) 0:01:59.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.053) 0:01:59.595 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:42:54 +0000 (0:00:00.030) 0:01:59.626 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:42:57 +0000 (0:00:02.926) 0:02:02.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:42:57 +0000 (0:00:00.033) 0:02:02.586 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:42:57 +0000 (0:00:00.029)
0:02:02.616 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:42:57 +0000 (0:00:00.042) 0:02:02.658 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:42:57 +0000 (0:00:00.038) 0:02:02.697 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:42:57 +0000 (0:00:00.038) 0:02:02.735 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:42:58 +0000 (0:00:00.396) 0:02:03.131 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:42:58 +0000 (0:00:00.686) 0:02:03.817 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:42:58 +0000 (0:00:00.031) 0:02:03.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null,
"status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:42:59 +0000 (0:00:00.647) 0:02:04.496 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:42:59 +0000 (0:00:00.364) 0:02:04.861 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:43:00 +0000 (0:00:00.032) 0:02:04.893 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:152
Wednesday 01 June 2022 16:43:00 +0000 (0:00:00.921) 0:02:05.814 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:43:01 +0000 (0:00:00.071) 0:02:05.886 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:43:01 +0000 (0:00:00.036) 0:02:05.923 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:43:01 +0000 (0:00:00.027) 0:02:05.950 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:43:01 +0000 (0:00:00.373) 0:02:06.323 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003104", "end": "2022-06-01 12:43:01.247132", "rc": 0, "start": "2022-06-01 12:43:01.244028" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:43:01 +0000 (0:00:00.382) 0:02:06.706 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.005092", "end": "2022-06-01 12:43:01.642613", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:43:01.637521" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.398) 0:02:07.104 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.059) 0:02:07.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.028) 0:02:07.192 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.060) 0:02:07.253 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.039) 0:02:07.292 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.026) 0:02:07.319 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.027) 0:02:07.346 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.036) 0:02:07.383 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.038) 0:02:07.422 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.045) 0:02:07.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.030) 0:02:07.498 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.028) 0:02:07.526 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.057) 0:02:07.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.031) 0:02:07.615 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.029) 0:02:07.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.029) 0:02:07.675 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.029) 0:02:07.704 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.033) 0:02:07.738 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.037) 0:02:07.775 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.031) 0:02:07.806 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:43:02 +0000 (0:00:00.031) 0:02:07.838 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.059) 0:02:07.898 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.060) 0:02:07.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.032) 0:02:07.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.031) 0:02:08.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.031) 0:02:08.054 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for
/cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.120) 0:02:08.174 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.036) 0:02:08.210 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.028) 0:02:08.239 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.028) 0:02:08.267 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.030) 0:02:08.297 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.059) 0:02:08.357 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.061) 0:02:08.419 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.029) 0:02:08.448 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.028) 0:02:08.477 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.029) 0:02:08.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.028) 0:02:08.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.029) 0:02:08.565 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.029) 0:02:08.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.031) 0:02:08.626 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.030) 0:02:08.657 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.029) 0:02:08.686 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.057) 0:02:08.744 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:43:03 +0000 (0:00:00.036) 0:02:08.780 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.122) 0:02:08.902 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.038) 0:02:08.940 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.039) 0:02:08.980 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.029) 0:02:09.009 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.034) 0:02:09.044 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.029) 0:02:09.073 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.029) 0:02:09.103 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.032)
0:02:09.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.030) 0:02:09.166 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.031) 0:02:09.198 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.046) 0:02:09.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.026) 0:02:09.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.035) 0:02:09.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.032) 0:02:09.339 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.032) 0:02:09.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.030) 0:02:09.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.026) 0:02:09.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.367) 0:02:09.796 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.037) 0:02:09.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:43:04 +0000 (0:00:00.025) 0:02:09.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.033) 0:02:09.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.031) 0:02:09.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.026) 0:02:09.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.029) 0:02:09.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.030) 0:02:10.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.028) 0:02:10.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.030) 0:02:10.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.033) 0:02:10.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.041) 0:02:10.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.030) 0:02:10.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.029) 0:02:10.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.028) 0:02:10.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.034) 0:02:10.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.093) 0:02:10.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.032) 0:02:10.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.031) 0:02:10.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.030) 0:02:10.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.031) 0:02:10.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.032) 0:02:10.519 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.033) 0:02:10.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.028) 0:02:10.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.028) 0:02:10.610 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.029) 0:02:10.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.035) 0:02:10.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.032) 0:02:10.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.034) 0:02:10.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.032) 0:02:10.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.030) 0:02:10.806 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.031) 0:02:10.837 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:43:05 +0000 (0:00:00.027) 0:02:10.865 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:10.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.029) 0:02:10.922 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:10.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:10.977 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.007 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.033) 0:02:11.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.028) 0:02:11.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:11.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:11.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.027) 0:02:11.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.033) 0:02:11.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.307 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.029) 0:02:11.337 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.026) 0:02:11.364 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create a RAID0 lvm raid device] ****************************************** task 
path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:154 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.030) 0:02:11.394 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.106) 0:02:11.501 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:43:06 +0000 (0:00:00.048) 0:02:11.549 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.532) 0:02:12.081 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage 
: define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.067) 0:02:12.148 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.027) 0:02:12.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.027) 0:02:12.203 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.058) 0:02:12.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.024) 0:02:12.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.030) 0:02:12.317 ******** ok: [/cache/rhel-x.qcow2] => 
{ "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.039) 0:02:12.357 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.031) 0:02:12.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.028) 0:02:12.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.029) 0:02:12.447 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.029) 0:02:12.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.044) 0:02:12.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.046) 0:02:12.567 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:43:07 +0000 (0:00:00.074) 0:02:12.642 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", 
"sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:43:10 +0000 (0:00:02.905) 0:02:15.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.030) 0:02:15.578 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.029) 0:02:15.608 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.050) 0:02:15.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.044) 0:02:15.702 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.036) 0:02:15.738 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:43:10 +0000 (0:00:00.032) 0:02:15.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:43:11 +0000 (0:00:00.642) 0:02:16.413 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", 
"state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:43:11 +0000 (0:00:00.419) 0:02:16.833 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:43:12 +0000 (0:00:00.643) 0:02:17.476 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:43:13 +0000 (0:00:00.406) 0:02:17.882 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:43:13 +0000 (0:00:00.028) 0:02:17.911 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:170 Wednesday 01 June 2022 16:43:13 +0000 (0:00:00.876) 0:02:18.788 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:43:13 +0000 (0:00:00.082) 0:02:18.870 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:43:14 +0000 (0:00:00.041) 0:02:18.912 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:43:14 +0000 (0:00:00.031) 0:02:18.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "5KXtK8-IHTs-zv3N-vVMr-pIz2-38Cw-DVRbhg" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "qH2Tql-ERt3-cS4U-UZ5l-Qs6I-dW8x-R8SMmx" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": 
"xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:43:14 +0000 (0:00:00.403) 0:02:19.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002472", "end": "2022-06-01 12:43:14.257252", "rc": 0, "start": "2022-06-01 12:43:14.254780" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:43:14 +0000 (0:00:00.367) 0:02:19.714 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003450", "end": "2022-06-01 12:43:14.652719", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:43:14.649269" } STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:43:15 +0000 (0:00:00.401) 0:02:20.115 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use.
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:43:15 +0000 (0:00:00.070) 0:02:20.186 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:43:15 +0000 (0:00:00.085) 0:02:20.271 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:43:15 +0000 (0:00:00.067) 0:02:20.338 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:43:15 +0000 (0:00:00.041) 0:02:20.380 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.725) 0:02:21.106 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.051) 0:02:21.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.039) 0:02:21.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.035) 0:02:21.233 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.036) 0:02:21.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.029) 0:02:21.298 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.054) 0:02:21.352 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.061) 0:02:21.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.032) 0:02:21.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.030) 0:02:21.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.037) 0:02:21.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.034) 0:02:21.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.031) 0:02:21.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.030) 0:02:21.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.031) 0:02:21.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.032) 0:02:21.674 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.062) 0:02:21.737 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:43:16 +0000 (0:00:00.060) 0:02:21.797 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid0", "vg1" ], "delta": "0:00:00.037633", "end": "2022-06-01 12:43:16.759635", "rc": 0, "start": "2022-06-01 12:43:16.722002" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.420) 0:02:22.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.044) 0:02:22.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.043) 0:02:22.305 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.061) 0:02:22.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.037) 
0:02:22.405 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.042) 0:02:22.447 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.071) 0:02:22.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.037) 0:02:22.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.036) 0:02:22.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 
2022 16:43:17 +0000 (0:00:00.031) 0:02:22.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.029) 0:02:22.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.033) 0:02:22.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.032) 0:02:22.718 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.041) 0:02:22.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.040) 0:02:22.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.033) 0:02:22.834 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:43:17 +0000 (0:00:00.031) 0:02:22.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.082) 0:02:22.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.033) 0:02:22.981 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.034) 0:02:23.015 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.065) 0:02:23.081 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.070) 0:02:23.152 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.032) 0:02:23.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.031) 0:02:23.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.030) 0:02:23.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.033) 0:02:23.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.030) 0:02:23.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.029) 0:02:23.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.031) 0:02:23.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.032) 0:02:23.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.030) 0:02:23.434 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.063) 0:02:23.498 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.033) 0:02:23.531 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.118) 0:02:23.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.035) 0:02:23.685 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.047) 0:02:23.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.038) 0:02:23.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.037) 0:02:23.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:43:18 +0000 (0:00:00.038) 0:02:23.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.031) 0:02:23.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.033) 0:02:23.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.031) 0:02:23.945 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.030) 0:02:23.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.046) 0:02:24.022 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.036) 0:02:24.058 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.037) 0:02:24.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.033) 0:02:24.129 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.031) 0:02:24.160 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.036) 0:02:24.197 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.036) 0:02:24.234 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101790.0041215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101790.0041215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 6757, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101790.0041215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.382) 0:02:24.616 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.038) 0:02:24.655 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.036) 0:02:24.691 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.035) 0:02:24.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.032) 0:02:24.759 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.036) 0:02:24.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.031) 0:02:24.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:43:19 +0000 (0:00:00.030) 0:02:24.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:43:20 +0000 (0:00:00.030) 0:02:24.889 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.041) 0:02:24.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.030) 0:02:24.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.030) 0:02:24.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.085 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.084) 0:02:25.169 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.036) 0:02:25.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.030) 0:02:25.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.030) 0:02:25.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.299 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.034) 0:02:25.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.033) 0:02:25.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.033) 0:02:25.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.032) 0:02:25.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.031) 0:02:25.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.032) 0:02:25.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:43:20 +0000 (0:00:00.032) 0:02:25.561 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.386) 0:02:25.947 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.371) 0:02:26.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.039) 0:02:26.358 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.035) 0:02:26.393 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.032) 0:02:26.425 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.031) 0:02:26.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.031) 0:02:26.489 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.033) 0:02:26.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.031) 0:02:26.554 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.035) 0:02:26.590 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.036) 0:02:26.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:43:21 +0000 (0:00:00.046) 0:02:26.673 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.035525", "end": "2022-06-01 12:43:21.626261", "rc": 0, "start": "2022-06-01 12:43:21.590736" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid0 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.416) 0:02:27.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid0" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.040) 0:02:27.130 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.040) 0:02:27.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.033) 0:02:27.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.033) 0:02:27.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.033) 0:02:27.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.033) 0:02:27.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.032) 0:02:27.336 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.034) 0:02:27.370 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.034) 0:02:27.404 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:172 Wednesday 01 June 2022 16:43:22 
+0000 (0:00:00.037) 0:02:27.442 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.121) 0:02:27.564 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:43:22 +0000 (0:00:00.048) 0:02:27.612 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.612) 0:02:28.224 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.076) 0:02:28.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.032) 0:02:28.333 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.032) 0:02:28.365 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.067) 0:02:28.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.028) 0:02:28.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.032) 0:02:28.493 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.042) 0:02:28.535 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.037) 0:02:28.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.032) 0:02:28.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.031) 0:02:28.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.031) 0:02:28.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.031) 0:02:28.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.045) 0:02:28.746 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:43:23 +0000 (0:00:00.032) 0:02:28.778 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:43:25 +0000 (0:00:01.835) 0:02:30.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.032) 0:02:30.647 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.030) 0:02:30.678 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.044) 0:02:30.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.040) 0:02:30.764 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.036) 0:02:30.800 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:43:25 +0000 (0:00:00.030) 0:02:30.831 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current 
mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:43:26 +0000 (0:00:00.981) 0:02:31.812 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:43:27 +0000 (0:00:00.406) 0:02:32.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:43:28 +0000 (0:00:00.670) 0:02:32.889 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:43:28 +0000 (0:00:00.367) 0:02:33.256 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:43:28 +0000 (0:00:00.031) 0:02:33.287 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** 
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:188 Wednesday 01 June 2022 16:43:29 +0000 (0:00:00.851) 0:02:34.139 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:43:29 +0000 (0:00:00.124) 0:02:34.263 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 
Wednesday 01 June 2022 16:43:29 +0000 (0:00:00.045) 0:02:34.309 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:43:29 +0000 (0:00:00.043) 0:02:34.352 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "5KXtK8-IHTs-zv3N-vVMr-pIz2-38Cw-DVRbhg" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "qH2Tql-ERt3-cS4U-UZ5l-Qs6I-dW8x-R8SMmx" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", 
"label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:43:29 +0000 (0:00:00.380) 0:02:34.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002524", "end": "2022-06-01 12:43:29.657960", "rc": 0, "start": "2022-06-01 12:43:29.655436" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.382) 0:02:35.115 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002687", "end": "2022-06-01 12:43:30.027722", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:43:30.025035" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.371) 
0:02:35.487 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.073) 0:02:35.560 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.034) 0:02:35.595 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.064) 0:02:35.659 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:43:30 +0000 (0:00:00.041) 0:02:35.701 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.703) 0:02:36.404 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.052) 0:02:36.457 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.038) 0:02:36.496 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.040) 0:02:36.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.038) 0:02:36.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] 
*********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.031) 0:02:36.606 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.053) 0:02:36.660 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.058) 0:02:36.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.031) 0:02:36.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.031) 0:02:36.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.032) 0:02:36.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:43:31 +0000 (0:00:00.031) 0:02:36.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.032) 0:02:36.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.033) 0:02:36.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.030) 0:02:36.943 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.032) 0:02:36.975 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.063) 0:02:37.039 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get 
information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.070) 0:02:37.109 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid0", "vg1" ], "delta": "0:00:00.039511", "end": "2022-06-01 12:43:32.088435", "rc": 0, "start": "2022-06-01 12:43:32.048924" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.441) 0:02:37.551 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.048) 0:02:37.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.043) 0:02:37.643 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.121) 0:02:37.765 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 
16:43:32 +0000 (0:00:00.040) 0:02:37.806 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:43:32 +0000 (0:00:00.043) 0:02:37.850 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.071) 0:02:37.921 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.037) 0:02:37.959 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.038) 0:02:37.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.031) 0:02:38.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.030) 0:02:38.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.028) 0:02:38.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.030) 0:02:38.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.033) 0:02:38.153 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.038) 0:02:38.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 
Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.030) 0:02:38.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.031) 0:02:38.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.034) 0:02:38.352 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.065) 0:02:38.418 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:43:33 
+0000 (0:00:00.062) 0:02:38.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.031) 0:02:38.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.031) 0:02:38.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.034) 0:02:38.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.031) 0:02:38.675 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.032) 0:02:38.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.033) 0:02:38.774 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
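[Editor's note] The pool and volume parameters echoed repeatedly in the log above (pool `vg1` on disks `sda`/`sdb`, volume `lv1` with `raid_level: raid0`, `size: 2g`, `fs_type: xfs`, mounted at `/opt/test1`) correspond to a role invocation along the following lines. This is a hypothetical sketch reconstructed solely from the logged values; the actual source of `tests_create_raid_pool_then_remove.yml` may differ.

```yaml
# Hypothetical reconstruction of the storage role invocation under test,
# based only on the parameters echoed in the log; not the actual test file.
- hosts: all
  vars:
    storage_pools:
      - name: vg1
        type: lvm
        disks: [sda, sdb]
        volumes:
          - name: lv1
            type: lvm
            raid_level: raid0      # verified above via `lvs --select 'lv_name=lv1&&lv_layout=raid0' vg1`
            raid_disks: [sda, sdb]
            size: 2g
            fs_type: xfs
            mount_point: /opt/test1
  roles:
    - linux-system-roles.storage
```

The verification tasks that follow in the log (fstab entry, crypttab absence, PV count and type, LVM RAID layout) check the system state against these same declared values.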
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.062) 0:02:38.836 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:43:33 +0000 (0:00:00.037) 0:02:38.874 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.123) 0:02:38.997 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.039) 0:02:39.037 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "5114aaeb-7fe7-4345-96a0-3f1045d51db6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.043) 0:02:39.080 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.043) 0:02:39.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.046) 0:02:39.170 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.041) 0:02:39.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.033) 0:02:39.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.035) 0:02:39.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.032) 0:02:39.313 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.033) 0:02:39.346 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.049) 0:02:39.396 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.035) 0:02:39.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.039) 0:02:39.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.032) 0:02:39.504 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.032) 0:02:39.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.039) 0:02:39.576 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:43:34 +0000 (0:00:00.039) 0:02:39.615 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101790.0041215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101790.0041215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 6757, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101790.0041215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.370) 0:02:39.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.038) 0:02:40.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.037) 0:02:40.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.037) 0:02:40.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.035) 0:02:40.135 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.038) 0:02:40.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.032) 0:02:40.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.033) 0:02:40.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:43:35 +0000 (0:00:00.031) 0:02:40.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.039) 0:02:40.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.034) 0:02:40.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.031) 0:02:40.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.030) 0:02:40.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.032) 0:02:40.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.031) 0:02:40.471 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.039) 0:02:40.510 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.040) 0:02:40.551 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.032) 0:02:40.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.030) 0:02:40.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.029) 0:02:40.643 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.032) 0:02:40.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.032) 0:02:40.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.035) 0:02:40.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.033) 0:02:40.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.031) 0:02:40.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.030) 0:02:40.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:43:35 +0000 (0:00:00.034) 0:02:40.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.033) 0:02:40.908 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.364) 0:02:41.272 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.380) 0:02:41.653 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.040) 0:02:41.694 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.037) 0:02:41.731 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.032) 0:02:41.763 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.032) 0:02:41.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.032) 0:02:41.828 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:43:36 +0000 (0:00:00.032) 0:02:41.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.032) 0:02:41.893 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.039) 0:02:41.933 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.036) 0:02:41.969 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.041) 0:02:42.010 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.034320", "end": "2022-06-01 12:43:36.961779", "rc": 0, "start": "2022-06-01 12:43:36.927459" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid0 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.411) 0:02:42.422 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid0" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.040) 0:02:42.462 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.042) 0:02:42.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.034) 0:02:42.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.035) 0:02:42.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.033) 0:02:42.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.032) 0:02:42.640 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.031) 0:02:42.672 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.039) 0:02:42.711 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:43:37 +0000 (0:00:00.033) 0:02:42.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the device created above] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:190 Wednesday 01 June 2022 16:43:37 
+0000 (0:00:00.043) 0:02:42.788 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.185) 0:02:42.973 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.046) 0:02:43.020 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.510) 0:02:43.531 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.076) 0:02:43.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.033) 0:02:43.640 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.032) 0:02:43.673 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.064) 0:02:43.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.027) 0:02:43.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.032) 0:02:43.798 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:43:38 +0000 (0:00:00.042) 0:02:43.840 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.035) 0:02:43.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.035) 0:02:43.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.031) 0:02:43.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.031) 0:02:43.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.031) 0:02:44.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.046) 0:02:44.052 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:43:39 +0000 (0:00:00.028) 0:02:44.080 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:43:42 +0000 (0:00:02.919) 0:02:47.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.033) 0:02:47.033 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.029) 
0:02:47.063 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.045) 0:02:47.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": 
"lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.040) 0:02:47.149 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.035) 0:02:47.184 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:43:42 +0000 (0:00:00.397) 0:02:47.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:43:43 +0000 (0:00:00.657) 0:02:48.239 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:43:43 +0000 (0:00:00.034) 0:02:48.273 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:43:44 +0000 (0:00:00.672) 0:02:48.946 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:43:44 +0000 (0:00:00.361) 0:02:49.307 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:43:44 +0000 (0:00:00.031) 0:02:49.338 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:206 Wednesday 01 June 2022 16:43:45 +0000 (0:00:00.857) 0:02:50.195 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:43:45 +0000 (0:00:00.090) 0:02:50.286 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:43:45 +0000 (0:00:00.107) 0:02:50.394 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:43:45 +0000 (0:00:00.030) 0:02:50.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:43:45 +0000 (0:00:00.390) 0:02:50.815 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003000", 
"end": "2022-06-01 12:43:45.733647", "rc": 0, "start": "2022-06-01 12:43:45.730647" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.376) 0:02:51.191 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002445", "end": "2022-06-01 12:43:46.095638", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:43:46.093193" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.366) 0:02:51.557 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.065) 0:02:51.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.031) 0:02:51.654 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.064) 0:02:51.718 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.040) 0:02:51.759 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.028) 0:02:51.787 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.030) 0:02:51.818 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:43:46 +0000 (0:00:00.037) 0:02:51.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.036) 0:02:51.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.039) 0:02:51.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.030) 0:02:51.962 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.028) 0:02:51.990 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.054) 0:02:52.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.033) 
0:02:52.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.031) 0:02:52.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.030) 0:02:52.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.029) 0:02:52.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.029) 0:02:52.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.028) 0:02:52.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.032) 0:02:52.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.029) 0:02:52.290 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.056) 0:02:52.346 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.059) 0:02:52.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.031) 0:02:52.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.032) 0:02:52.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.033) 0:02:52.503 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.061) 0:02:52.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.037) 0:02:52.602 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.027) 0:02:52.629 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.026) 0:02:52.656 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.030) 0:02:52.687 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.071) 0:02:52.758 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.063) 0:02:52.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:43:47 +0000 (0:00:00.030) 0:02:52.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.030) 0:02:52.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.031) 0:02:52.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.074) 0:02:52.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.030) 0:02:53.020 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.029) 0:02:53.050 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.029) 0:02:53.079 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.030) 0:02:53.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.029) 0:02:53.139 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.066) 0:02:53.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.039) 0:02:53.245 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.128) 0:02:53.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.038) 0:02:53.412 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.040) 0:02:53.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.031) 0:02:53.484 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.034) 0:02:53.519 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.029) 0:02:53.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.031) 0:02:53.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.030) 
0:02:53.611 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.029) 0:02:53.641 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.032) 0:02:53.673 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.045) 0:02:53.719 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.025) 0:02:53.745 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.037) 0:02:53.782 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.031) 0:02:53.813 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:43:48 +0000 (0:00:00.031) 0:02:53.844 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.034) 0:02:53.879 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.026) 0:02:53.905 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.375) 0:02:54.281 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.038) 0:02:54.320 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.027) 0:02:54.347 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.033) 0:02:54.380 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.027) 0:02:54.438 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.497 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.028) 0:02:54.526 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.024) 0:02:54.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.031) 0:02:54.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.641 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.700 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.037) 0:02:54.737 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.035) 0:02:54.773 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.034) 0:02:54.808 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.029) 0:02:54.838 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:43:49 +0000 (0:00:00.031) 0:02:54.869 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.034) 0:02:54.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.029) 0:02:54.933 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.028) 0:02:54.962 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.030) 0:02:54.992 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.027) 0:02:55.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.027) 0:02:55.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.027) 0:02:55.075 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.029) 0:02:55.104 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.031) 0:02:55.136 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.034) 0:02:55.170 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.091) 0:02:55.262 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.034) 0:02:55.297 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.030) 0:02:55.328 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.031) 0:02:55.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.032) 0:02:55.392 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.032) 0:02:55.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.034) 0:02:55.460 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.035) 0:02:55.496 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.034) 0:02:55.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.032) 0:02:55.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.031) 0:02:55.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.038) 0:02:55.634 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.038) 0:02:55.673 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.033) 0:02:55.706 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.030) 0:02:55.737 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.031) 0:02:55.769 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.029) 0:02:55.799 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.031) 0:02:55.830 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:43:50 +0000 (0:00:00.032) 0:02:55.863 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:43:51 +0000 (0:00:00.029) 0:02:55.892 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1080 changed=14 unreachable=0 failed=0 skipped=932 rescued=0 ignored=0

Wednesday 01 June 2022 16:43:51 +0000 (0:00:00.017) 0:02:55.910 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.58s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : set up new/current mounts ------------------ 1.30s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : remove obsolete mounts --------------------- 1.11s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : set up new/current mounts ------------------ 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:2 ---------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:43:51 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:43:53 +0000 (0:00:01.282) 0:00:01.304 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_raid_pool_then_remove_nvme_generated.yml ****************
2 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:43:53 +0000 (0:00:00.028) 0:00:01.333 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:43:53 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:43:55 +0000 (0:00:01.256) 0:00:01.279 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_raid_pool_then_remove_scsi_generated.yml ****************
2 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_scsi_generated.yml:3
Wednesday 01 June 2022 16:43:55 +0000 (0:00:00.025) 0:00:01.304 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_scsi_generated.yml:7
Wednesday 01 June 2022 16:43:56 +0000 (0:00:01.071) 0:00:02.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:2
Wednesday 01 June 2022 16:43:56 +0000 (0:00:00.026) 0:00:02.402 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:17
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.804) 0:00:03.206 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.042) 0:00:03.249 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.154) 0:00:03.404 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.534) 0:00:03.939 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.075) 0:00:04.014 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.023) 0:00:04.038 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:43:57 +0000 (0:00:00.025) 0:00:04.063 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:43:58 +0000 (0:00:00.195) 0:00:04.259 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:43:58 +0000 (0:00:00.019) 0:00:04.278 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:43:59 +0000 (0:00:01.121) 0:00:05.399 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:43:59 +0000 (0:00:00.045) 0:00:05.445 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:43:59 +0000 (0:00:00.043) 0:00:05.489 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:43:59 +0000 (0:00:00.668) 0:00:06.157 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:44:00 +0000 (0:00:00.081) 0:00:06.238 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:44:00 +0000 (0:00:00.020) 0:00:06.258 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:44:00 +0000 (0:00:00.022) 0:00:06.280 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:44:00 +0000 (0:00:00.020) 0:00:06.301 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:44:00 +0000 (0:00:00.815) 0:00:07.116 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:44:02 +0000 (0:00:00.042) 0:00:08.888 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:44:02 +0000 (0:00:00.027) 0:00:08.930 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:44:02 +0000 (0:00:00.507) 0:00:08.958 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.028) 0:00:09.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.024) 0:00:09.494 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.035) 0:00:09.519 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.032) 0:00:09.554 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.027) 0:00:09.587 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.027) 0:00:09.620 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.027) 0:00:09.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.025) 0:00:09.674 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.027) 0:00:09.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:44:03 +0000 (0:00:00.493) 0:00:09.728 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:44:04 +0000 (0:00:00.029) 0:00:10.221 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:44:04 +0000 (0:00:01.114) 0:00:10.250 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:20
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.030) 0:00:11.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:27
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.047) 0:00:11.395 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.521) 0:00:11.442 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.036) 0:00:11.964 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.029) 0:00:12.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.030) 0:00:12.031 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] }

TASK [Create a RAID0 device] ***************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:31
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.060) 0:00:12.062 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:44:05 +0000 (0:00:00.083) 0:00:12.122 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.502) 0:00:12.205 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.068) 0:00:12.707 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028) 0:00:12.776 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.034) 0:00:12.805 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.062) 0:00:12.840 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.025) 0:00:12.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028) 0:00:12.927 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.039) 0:00:12.956 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028)? 0:00:12.996 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages]
********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.031) 0:00:13.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028) 0:00:13.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028) 0:00:13.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.028) 0:00:13.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:44:06 +0000 (0:00:00.029) 0:00:13.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:44:07 +0000 (0:00:00.042) 0:00:13.185 ******** TASK [linux-system-roles.storage : manage the pools and 
volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:44:07 +0000 (0:00:00.026) 0:00:13.212 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, 
"path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:44:15 +0000 (0:00:08.466) 0:00:21.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.031) 0:00:21.710 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.028) 0:00:21.738 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", 
"passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.046) 0:00:21.785 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": 
null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.046) 0:00:21.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.035) 0:00:21.867 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:44:15 +0000 (0:00:00.030) 0:00:21.898 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:44:16 +0000 (0:00:00.982) 0:00:22.880 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:44:17 +0000 (0:00:01.268) 
0:00:24.149 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:44:18 +0000 (0:00:00.632) 0:00:24.781 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:44:18 +0000 (0:00:00.353) 0:00:25.135 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:44:19 +0000 (0:00:00.068) 0:00:25.204 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:52 Wednesday 01 June 2022 16:44:19 +0000 (0:00:00.883) 0:00:26.087 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:44:19 +0000 (0:00:00.058) 0:00:26.145 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", 
"_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:44:20 +0000 (0:00:00.046) 0:00:26.192 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:44:20 +0000 (0:00:00.031) 0:00:26.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "e21a394b-616e-4934-9306-27ee282735af" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "20G", "type": "raid0", "uuid": "pO5CzD-hwEe-r3xp-GsGR-8KWO-IRCm-9UKNyW" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "2c5ca47e-ea4f-06f6-f981-790b2456af03" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "2c5ca47e-ea4f-06f6-f981-790b2456af03" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" },
"/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
"/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
"/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:44:20 +0000 (0:00:00.471) 0:00:26.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002561", "end": "2022-06-01 12:44:20.393586", "rc": 0, "start": "2022-06-01 12:44:20.391025" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0
/dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0
/dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:44:20 +0000 (0:00:00.466) 0:00:27.162 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002384", "end": "2022-06-01 12:44:20.754378", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:44:20.751994" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:44:21 +0000 (0:00:00.366) 0:00:27.529 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:44:21 +0000 (0:00:00.075) 0:00:27.604 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:44:21 +0000 (0:00:00.031) 0:00:27.635 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:44:21 +0000 (0:00:00.071) 0:00:27.706 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:44:21 +0000 (0:00:00.044) 0:00:27.751 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.523) 0:00:28.275 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.044) 0:00:28.319 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.039) 0:00:28.359 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.036) 0:00:28.396 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.036) 0:00:28.432 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.042) 0:00:28.469 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.042) 0:00:28.511 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.101) 0:00:28.612 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.008133", "end": "2022-06-01 12:44:22.239992", "rc": 0, "start": "2022-06-01 12:44:22.231859" }

STDOUT:

/dev/md/vg1-1:
           Version : 1.2
     Creation Time : Wed Jun 1 12:44:08 2022
        Raid Level : raid0
        Array Size : 20951040 (19.98 GiB 21.45 GB)
      Raid Devices : 2
     Total Devices : 2
       Persistence : Superblock is persistent

       Update Time : Wed Jun 1 12:44:08 2022
             State : clean
    Active Devices : 2
   Working Devices : 2
    Failed Devices : 0
     Spare Devices : 0

            Layout : -unknown-
        Chunk Size : 512K

Consistency Policy : none

              Name : vg1-1
              UUID : 2c5ca47e:ea4f06f6:f981790b:2456af03
            Events : 0

    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.397) 0:00:29.010 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ None\\\n" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.045) 0:00:29.056 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ None\\\n" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.043) 0:00:29.100 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ None\\\n" }, "changed": false }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.042) 0:00:29.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:44:22 +0000 (0:00:00.034) 0:00:29.176 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.033) 0:00:29.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.035) 0:00:29.246 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.033) 0:00:29.279 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.063) 0:00:29.342 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.090) 0:00:29.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:29.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:29.498 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:29.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.029) 0:00:29.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:29.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:29.619 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:29.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:29.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:29.713 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.063) 0:00:29.777 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.036) 0:00:29.813 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.037) 0:00:29.851 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.060) 0:00:29.911 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.036) 0:00:29.948 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.036) 0:00:29.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:30.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:30.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.030) 0:00:30.077 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:30.110 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:44:23 +0000 (0:00:00.032) 0:00:30.142 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.065) 0:00:30.207 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.099) 0:00:30.307 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.031) 0:00:30.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.032) 0:00:30.371 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.031) 0:00:30.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.038) 0:00:30.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.036) 0:00:30.476 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.037) 0:00:30.513 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.035) 0:00:30.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.032) 0:00:30.582 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:30.616 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:30.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.032) 0:00:30.681 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.039) 0:00:30.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.090) 0:00:30.811 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.034) 0:00:30.845 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.035) 0:00:30.880 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:30.914 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:30.948 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:30.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.036) 0:00:31.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:31.052 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.032) 0:00:31.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.030) 0:00:31.115 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.031) 0:00:31.147 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:44:24 +0000 (0:00:00.033) 0:00:31.181 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.035) 0:00:31.216 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.082) 0:00:31.299 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.035) 0:00:31.335 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.140) 0:00:31.475 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.038) 0:00:31.514 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.043) 0:00:31.558 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.039) 0:00:31.597 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.038) 0:00:31.636 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.039) 0:00:31.675 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.032) 0:00:31.708 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.033) 0:00:31.741 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.032) 0:00:31.773 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.032) 0:00:31.806 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.051) 0:00:31.858 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.035) 0:00:31.894 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.037) 0:00:31.931 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.030) 0:00:31.962 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.033) 0:00:31.995 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.039) 0:00:32.034 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:44:25 +0000 (0:00:00.042) 0:00:32.076 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.8271215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.8271215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7262, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101854.8271215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.372) 0:00:32.449 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.038) 0:00:32.487 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.036) 0:00:32.524 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.033) 0:00:32.557 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.031) 0:00:32.589 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.035) 0:00:32.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.034) 0:00:32.659 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.033) 0:00:32.692 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.033) 0:00:32.725 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.039) 0:00:32.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.031) 0:00:32.797 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.030) 0:00:32.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.035) 0:00:32.862 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher]
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.032) 0:00:32.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.031) 0:00:32.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.039) 0:00:32.966 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.042) 0:00:33.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.034) 0:00:33.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.088) 0:00:33.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:44:26 +0000 (0:00:00.033) 0:00:33.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.035) 0:00:33.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.033) 0:00:33.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.032) 0:00:33.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.032) 0:00:33.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.034) 0:00:33.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.032) 0:00:33.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.031) 0:00:33.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.031) 0:00:33.428 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:44:27 +0000 (0:00:00.461) 0:00:33.890 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.387) 0:00:34.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.041) 0:00:34.318 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } 
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.037) 0:00:34.356 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.033) 0:00:34.389 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.032) 0:00:34.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.032) 0:00:34.455 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.032) 0:00:34.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.032) 0:00:34.520 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.038) 
0:00:34.559 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.035) 0:00:34.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.040) 0:00:34.635 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.037327", "end": "2022-06-01 12:44:28.283537", "rc": 0, "start": "2022-06-01 12:44:28.246210" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.418) 0:00:35.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.040) 0:00:35.093 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.043) 0:00:35.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:44:28 +0000 (0:00:00.035) 0:00:35.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.035) 0:00:35.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.035) 0:00:35.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.033) 0:00:35.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.031) 0:00:35.308 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.038) 0:00:35.347 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.123) 0:00:35.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.037) 0:00:35.508 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, 
"inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.043) 0:00:35.551 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.038) 0:00:35.590 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.035) 0:00:35.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.040) 0:00:35.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.031) 0:00:35.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 
Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.031) 0:00:35.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.031) 0:00:35.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.032) 0:00:35.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.050) 0:00:35.843 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.079) 0:00:35.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] 
**************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.037) 0:00:35.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.031) 0:00:35.992 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.033) 0:00:36.026 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.040) 0:00:36.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:44:29 +0000 (0:00:00.043) 0:00:36.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.5681214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.5681214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", 
"inode": 7228, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101854.5681214, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.378) 0:00:36.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.040) 0:00:36.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.037) 0:00:36.567 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.036) 0:00:36.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.635 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.036) 0:00:36.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.035) 0:00:36.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.039) 0:00:36.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.034) 0:00:36.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.032) 0:00:36.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.039) 0:00:37.018 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.036) 0:00:37.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.031) 0:00:37.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.042) 0:00:37.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:44:30 +0000 (0:00:00.036) 0:00:37.166 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.034) 0:00:37.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.033) 0:00:37.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.032) 0:00:37.267 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.032) 0:00:37.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.034) 0:00:37.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.033) 0:00:37.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.032) 0:00:37.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.032) 0:00:37.433 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:44:31 +0000 (0:00:00.387) 0:00:37.820 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, 
"lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.372) 0:00:38.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.039) 0:00:38.233 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.035) 0:00:38.268 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.032) 0:00:38.300 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.033) 0:00:38.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.033) 0:00:38.367 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.032) 0:00:38.400 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.032) 0:00:38.432 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.036) 0:00:38.468 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.034) 0:00:38.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.043) 0:00:38.546 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.032467", "end": "2022-06-01 12:44:32.188474", "rc": 0, "start": "2022-06-01 12:44:32.156007" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.466) 0:00:39.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.044) 0:00:39.057 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.046) 0:00:39.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.035) 0:00:39.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:44:32 +0000 (0:00:00.036) 0:00:39.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.034) 0:00:39.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.036) 0:00:39.247 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.034) 0:00:39.281 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.038) 0:00:39.320 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.130) 0:00:39.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.038) 0:00:39.489 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 
4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "e21a394b-616e-4934-9306-27ee282735af" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "e21a394b-616e-4934-9306-27ee282735af" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.042) 0:00:39.532 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.037) 0:00:39.570 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.035) 0:00:39.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.037) 0:00:39.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.031) 0:00:39.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.033) 0:00:39.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.032) 0:00:39.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.031) 0:00:39.772 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.048) 0:00:39.820 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.035) 0:00:39.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.041) 0:00:39.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.032) 0:00:39.930 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.032) 0:00:39.962 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.037) 0:00:39.999 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:44:33 +0000 (0:00:00.037) 0:00:40.037 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.3291216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.3291216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7193, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101854.3291216, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.404) 0:00:40.442 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.043) 0:00:40.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 
16:44:34 +0000 (0:00:00.041) 0:00:40.526 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.035) 0:00:40.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:40.593 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.036) 0:00:40.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:40.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:40.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.033) 0:00:40.726 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.038) 0:00:40.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.030) 0:00:40.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.030) 0:00:40.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.032) 0:00:40.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.030) 0:00:40.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.035) 0:00:40.924 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.040) 0:00:40.965 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.036) 0:00:41.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:41.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:41.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.031) 0:00:41.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.036) 0:00:41.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:44:34 +0000 (0:00:00.032) 0:00:41.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.031) 0:00:41.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.032) 0:00:41.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.034) 0:00:41.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.031) 0:00:41.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
16:44:35 +0000 (0:00:00.035) 0:00:41.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.072) 0:00:41.404 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:44:35 +0000 (0:00:00.418) 0:00:41.822 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.372) 0:00:42.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.039) 0:00:42.235 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.037) 0:00:42.273 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.032) 0:00:42.305 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.031) 0:00:42.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.034) 0:00:42.370 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.037) 0:00:42.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.032) 0:00:42.441 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.036) 0:00:42.478 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.035) 0:00:42.513 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.042) 0:00:42.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.041924", "end": "2022-06-01 12:44:36.206688", "rc": 0, "start": "2022-06-01 12:44:36.164764" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.431) 0:00:42.987 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.042) 0:00:43.030 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.042) 0:00:43.072 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.035) 0:00:43.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 
June 2022 16:44:36 +0000 (0:00:00.034) 0:00:43.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:44:36 +0000 (0:00:00.035) 0:00:43.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.037) 0:00:43.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.033) 0:00:43.248 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.032) 0:00:43.281 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.030) 0:00:43.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:54 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.034) 0:00:43.345 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.068) 0:00:43.414 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.048) 0:00:43.463 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.518) 0:00:43.981 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.080) 0:00:44.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.040) 0:00:44.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:44:37 +0000 (0:00:00.034) 0:00:44.137 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.068) 0:00:44.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.028) 0:00:44.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.032) 0:00:44.266 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "present", "type": "lvm", "volumes": [ { 
"mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.042) 0:00:44.309 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.036) 0:00:44.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.035) 0:00:44.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.077) 0:00:44.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.032) 0:00:44.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.031) 0:00:44.523 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.047) 0:00:44.571 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:44:38 +0000 (0:00:00.030) 0:00:44.601 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": 
"/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:44:40 +0000 (0:00:02.038) 0:00:46.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.035) 0:00:46.676 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.032) 0:00:46.708 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], 
"mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.051) 0:00:46.760 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.047) 0:00:46.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.036) 0:00:46.844 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:44:40 +0000 (0:00:00.033) 0:00:46.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:44:41 +0000 (0:00:00.669) 0:00:47.547 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, 
"name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:44:42 +0000 (0:00:01.079) 0:00:48.627 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:44:43 +0000 (0:00:00.668) 0:00:49.295 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:44:43 +0000 (0:00:00.363) 0:00:49.659 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:44:43 +0000 (0:00:00.030) 0:00:49.690 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:75 Wednesday 01 June 2022 16:44:44 +0000 (0:00:00.893) 0:00:50.583 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:44:44 +0000 (0:00:00.060) 0:00:50.643 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:44:44 +0000 (0:00:00.046) 0:00:50.690 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:44:44 +0000 (0:00:00.032) 0:00:50.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "e21a394b-616e-4934-9306-27ee282735af" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "20G", "type": "raid0", "uuid": "pO5CzD-hwEe-r3xp-GsGR-8KWO-IRCm-9UKNyW" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "2c5ca47e-ea4f-06f6-f981-790b2456af03" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "2c5ca47e-ea4f-06f6-f981-790b2456af03" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:44:44 +0000 (0:00:00.374) 0:00:51.097 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002575", "end": "2022-06-01 12:44:44.704661", "rc": 0, "start": "2022-06-01 12:44:44.702086" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 /dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.381) 0:00:51.479 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002621", "end": "2022-06-01 12:44:45.066432", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:44:45.063811" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed 
in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.359) 0:00:51.838 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.078) 0:00:51.917 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.031) 0:00:51.949 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.066) 0:00:52.015 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:44:45 +0000 (0:00:00.047) 0:00:52.063 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.382) 0:00:52.446 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.045) 0:00:52.492 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.040) 0:00:52.532 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.040) 0:00:52.573 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.039) 0:00:52.612 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.038) 0:00:52.650 ********
ok: [/cache/rhel-x.qcow2] =>
(item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.047) 0:00:52.697 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.064) 0:00:52.762 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.006434", "end": "2022-06-01 12:44:46.389273", "rc": 0, "start": "2022-06-01 12:44:46.382839" } STDOUT: /dev/md/vg1-1: Version : 1.2 Creation Time : Wed Jun 1 12:44:08 2022 Raid Level : raid0 Array Size : 20951040 (19.98 GiB 21.45 GB) Raid Devices : 2 Total Devices : 2 Persistence : Superblock is persistent Update Time : Wed Jun 1 12:44:08 2022 State : clean Active Devices : 2 Working Devices : 2 Failed Devices : 0 Spare Devices : 0 Layout : -unknown- Chunk Size : 512K Consistency Policy : none Name : vg1-1 UUID : 2c5ca47e:ea4f06f6:f981790b:2456af03 Events : 0 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:44:46 +0000 (0:00:00.398) 0:00:53.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.040) 0:00:53.201 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 0\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.040) 0:00:53.242 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.2\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.042) 0:00:53.284 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.046) 0:00:53.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.044) 0:00:53.375 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.043) 0:00:53.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.030) 0:00:53.448 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.060) 0:00:53.508 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.085) 0:00:53.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.032) 0:00:53.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.032) 0:00:53.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.029) 0:00:53.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.031) 0:00:53.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.030) 0:00:53.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.030) 0:00:53.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.032) 0:00:53.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.078) 0:00:53.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.033) 0:00:53.926 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 
Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.064) 0:00:53.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.041) 0:00:54.032 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.044) 0:00:54.077 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:44:47 +0000 (0:00:00.065) 0:00:54.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.042) 0:00:54.185 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.044) 0:00:54.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of 
crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.029) 0:00:54.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.030) 0:00:54.322 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.034) 0:00:54.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.389 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.072) 0:00:54.462 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.101) 0:00:54.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.034) 0:00:54.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.031) 0:00:54.695 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.034) 0:00:54.729 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.031) 0:00:54.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.032) 0:00:54.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.035) 0:00:54.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.034) 0:00:54.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.031) 0:00:54.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.030) 0:00:55.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.030) 0:00:55.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.033) 0:00:55.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.051) 0:00:55.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:44:48 +0000 (0:00:00.035) 0:00:55.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.033) 0:00:55.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.031) 0:00:55.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.030) 0:00:55.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.029) 0:00:55.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.032) 0:00:55.333 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.030) 0:00:55.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.031) 0:00:55.394 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.085) 0:00:55.480 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.038) 0:00:55.519 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.127) 0:00:55.646 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.037) 0:00:55.684 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508753, "block_size": 4096, "block_total": 520704, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 2083852288, "size_total": 2132803584, "uuid": "1539456e-c27c-42ab-a291-4a622e87165d" } ], 
"storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.045) 0:00:55.729 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.039) 0:00:55.769 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.101) 0:00:55.870 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.051) 0:00:55.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.035) 0:00:55.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.037) 0:00:55.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.035) 0:00:56.030 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.037) 0:00:56.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.052) 0:00:56.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:44:49 +0000 (0:00:00.037) 0:00:56.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.040) 0:00:56.198 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.033) 0:00:56.232 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.034) 0:00:56.267 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.042) 0:00:56.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.040) 0:00:56.350 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.8271215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.8271215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7262, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654101854.8271215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.379) 0:00:56.730 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.040) 0:00:56.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.038) 0:00:56.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.036) 0:00:56.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.032) 0:00:56.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.038) 0:00:56.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.032) 0:00:56.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.032) 0:00:56.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.032) 0:00:57.013 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.039) 0:00:57.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.032) 0:00:57.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.037) 
0:00:57.123 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:44:50 +0000 (0:00:00.033) 0:00:57.156 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.033) 0:00:57.190 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.033) 0:00:57.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.039) 0:00:57.263 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.037) 0:00:57.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.033) 0:00:57.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.365 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.397 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.032) 0:00:57.429 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.033) 0:00:57.527 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.559 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.590 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.622 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.031) 0:00:57.653 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:44:51 +0000 (0:00:00.372) 0:00:58.026 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.360) 0:00:58.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.040) 0:00:58.427 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.034) 0:00:58.462 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.033) 0:00:58.496 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.032) 0:00:58.528 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.031) 0:00:58.560 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.031) 0:00:58.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug]
*******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.033) 0:00:58.625 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.036) 0:00:58.662 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.038) 0:00:58.700 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.047) 0:00:58.748 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.030945", "end": "2022-06-01 12:44:52.387604", "rc": 0, "start": "2022-06-01 12:44:52.356659" }
STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:44:52 +0000 (0:00:00.414) 0:00:59.163 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type]
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.042) 0:00:59.206 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.045) 0:00:59.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.036) 0:00:59.287 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.101) 0:00:59.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.036) 0:00:59.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.034) 0:00:59.460 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.032) 0:00:59.492 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.036) 0:00:59.528 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.126) 0:00:59.655 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.038) 0:00:59.693 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device":
"/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "0481c3ad-d544-49bc-802d-7ebd4c631e89" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.043) 0:00:59.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.038) 0:00:59.775 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.036) 0:00:59.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 
16:44:53 +0000 (0:00:00.038) 0:00:59.850 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.034) 0:00:59.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.031) 0:00:59.916 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.034) 0:00:59.951 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.033) 0:00:59.984 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the
device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.047) 0:01:00.031 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.036) 0:01:00.068 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.039) 0:01:00.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.040) 0:01:00.148 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:44:53 +0000 (0:00:00.033) 0:01:00.182 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.040) 0:01:00.223 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.038) 0:01:00.262 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.5681214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.5681214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7228, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101854.5681214, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.371) 0:01:00.633 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.039) 0:01:00.672 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.038) 0:01:00.711 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm"
}, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.037) 0:01:00.748 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.033) 0:01:00.782 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.037) 0:01:00.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.031) 0:01:00.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.032) 0:01:00.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.033) 0:01:00.918 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted]
****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.042) 0:01:00.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.032) 0:01:00.993 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.032) 0:01:01.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.031) 0:01:01.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.031) 0:01:01.088 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.033) 0:01:01.122 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed":
false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:44:54 +0000 (0:00:00.038) 0:01:01.161 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.035) 0:01:01.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.031) 0:01:01.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.032) 0:01:01.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.031) 0:01:01.292 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.035) 0:01:01.327 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false,
"skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.035) 0:01:01.362 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.033) 0:01:01.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.033) 0:01:01.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.032) 0:01:01.463 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.032) 0:01:01.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.039) 0:01:01.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of
the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.038) 0:01:01.573 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:44:55 +0000 (0:00:00.441) 0:01:02.014 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.371) 0:01:02.386 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.039) 0:01:02.426 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.035) 0:01:02.461 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.034) 0:01:02.495 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.031) 0:01:02.526
********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.030) 0:01:02.557 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.035) 0:01:02.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.034) 0:01:02.627 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.038) 0:01:02.666 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.036) 0:01:02.703 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.042) 0:01:02.745 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b",
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.035068", "end": "2022-06-01 12:44:56.405780", "rc": 0, "start": "2022-06-01 12:44:56.370712" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:44:56 +0000 (0:00:00.433) 0:01:03.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.039) 0:01:03.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.039) 0:01:03.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.035) 0:01:03.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.034) 0:01:03.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.033) 0:01:03.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.033) 0:01:03.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.035) 0:01:03.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.037) 0:01:03.467 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.125) 0:01:03.593 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.036) 0:01:03.629 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "e21a394b-616e-4934-9306-27ee282735af" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 769066, "block_size": 4096, "block_total": 782848, "block_used": 13782, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1570813, "inode_total": 1570816, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 3150094336, "size_total": 3206545408, "uuid": "e21a394b-616e-4934-9306-27ee282735af" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.043) 0:01:03.673 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] 
*************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.040) 0:01:03.714 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.036) 0:01:03.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.039) 0:01:03.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.031) 0:01:03.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.032) 0:01:03.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.031) 0:01:03.886 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, 
"storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.034) 0:01:03.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.047) 0:01:03.967 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.035) 0:01:04.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.039) 0:01:04.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.038) 0:01:04.080 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.034) 0:01:04.115 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:44:57 +0000 (0:00:00.040) 0:01:04.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.038) 0:01:04.194 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101854.3291216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101854.3291216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7193, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101854.3291216, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 
2022 16:44:58 +0000 (0:00:00.395) 0:01:04.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.038) 0:01:04.628 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.040) 0:01:04.668 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.035) 0:01:04.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.031) 0:01:04.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.082) 0:01:04.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.033) 0:01:04.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.033) 0:01:04.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.032) 0:01:04.918 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.039) 0:01:04.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.033) 0:01:04.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.035) 0:01:05.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.032) 0:01:05.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.033) 0:01:05.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.032) 0:01:05.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:44:58 +0000 (0:00:00.041) 0:01:05.167 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.041) 0:01:05.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.032) 0:01:05.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.034) 0:01:05.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.033) 0:01:05.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.034) 0:01:05.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.033) 0:01:05.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.035) 0:01:05.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.042) 0:01:05.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.036) 0:01:05.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.035) 0:01:05.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.034) 0:01:05.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.034) 0:01:05.596 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:44:59 +0000 (0:00:00.362) 0:01:05.958 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.369) 0:01:06.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.040) 0:01:06.367 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } 
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.037) 0:01:06.405 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.033) 0:01:06.439 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.032) 0:01:06.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.033) 0:01:06.504 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.033) 0:01:06.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.033) 0:01:06.571 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.040) 
0:01:06.611 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.034) 0:01:06.646 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.041) 0:01:06.687 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.034707", "end": "2022-06-01 12:45:00.324566", "rc": 0, "start": "2022-06-01 12:45:00.289859" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.407) 0:01:07.095 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.043) 0:01:07.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:45:00 +0000 (0:00:00.042) 0:01:07.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.036) 0:01:07.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.034) 0:01:07.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.034) 0:01:07.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.034) 0:01:07.321 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.033) 0:01:07.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.032) 0:01:07.387 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:45:01 +0000 
(0:00:00.032) 0:01:07.420 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the device created above] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:77 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.031) 0:01:07.452 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.079) 0:01:07.532 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.094) 0:01:07.627 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:45:01 +0000 (0:00:00.557) 0:01:08.184 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.074) 0:01:08.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.033) 0:01:08.291 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.048) 0:01:08.340 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.066) 0:01:08.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.030) 0:01:08.436 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.034) 0:01:08.470 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "raid0", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.042) 0:01:08.513 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.034) 0:01:08.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.032) 0:01:08.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.031) 0:01:08.612 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.033) 0:01:08.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.031) 0:01:08.677 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.057) 0:01:08.734 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:45:02 +0000 (0:00:00.031) 0:01:08.766 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": 
"lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:45:06 +0000 (0:00:03.650) 0:01:12.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:45:06 +0000 (0:00:00.032) 0:01:12.449 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:45:06 +0000 (0:00:00.030) 0:01:12.480 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": 
"destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", 
"vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
16:45:06 +0000 (0:00:00.050) 0:01:12.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:45:06 +0000 (0:00:00.044) 0:01:12.575 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:45:06 +0000 (0:00:00.038) 0:01:12.614 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:45:07 +0000 (0:00:01.101) 0:01:13.716 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:45:08 +0000 (0:00:00.698) 0:01:14.414 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:45:08 +0000 (0:00:00.033) 0:01:14.448 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} 
} TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:45:08 +0000 (0:00:00.707) 0:01:15.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:45:09 +0000 (0:00:00.368) 0:01:15.524 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:45:09 +0000 (0:00:00.034) 0:01:15.559 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:98 Wednesday 01 June 2022 16:45:10 +0000 (0:00:00.818) 0:01:16.377 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:45:10 +0000 (0:00:00.060) 0:01:16.438 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:45:10 +0000 (0:00:00.044) 0:01:16.482 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:45:10 +0000 (0:00:00.027) 0:01:16.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": 
"root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:45:10 +0000 (0:00:00.375) 0:01:16.885 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002655", "end": "2022-06-01 12:45:10.487226", "rc": 0, "start": "2022-06-01 12:45:10.484571" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.373) 0:01:17.258 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002489", "end": "2022-06-01 12:45:10.852003", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:45:10.849514" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.368) 0:01:17.627 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.080) 0:01:17.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.033) 0:01:17.741 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.066) 0:01:17.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.045) 0:01:17.852 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.030) 0:01:17.883 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.029) 0:01:17.912 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.039) 0:01:17.952 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.037) 0:01:17.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.039) 0:01:18.029 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid0" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.035) 0:01:18.065 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.027) 0:01:18.092 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.055) 0:01:18.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:45:11 +0000 (0:00:00.031) 0:01:18.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.033) 0:01:18.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.032) 0:01:18.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.367 
******** ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.081) 0:01:18.449 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.058) 0:01:18.507 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.085) 0:01:18.592 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.623 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.654 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.029) 0:01:18.683 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.032) 0:01:18.716 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.034) 0:01:18.751 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.034) 0:01:18.786 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.816 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.032) 0:01:18.849 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:18.879 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.058) 0:01:18.938 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.038) 0:01:18.977 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.030) 0:01:19.007 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.033) 0:01:19.041 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.041) 0:01:19.083 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:45:12 +0000 (0:00:00.067) 0:01:19.150 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.101) 0:01:19.252 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.031) 0:01:19.284 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.035) 0:01:19.319 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.031) 0:01:19.351 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.030) 0:01:19.382 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.412 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.441 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.470 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.032) 0:01:19.502 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.531 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.030) 0:01:19.562 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.591 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.621 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.651 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.032) 0:01:19.683 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.713 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.032) 0:01:19.746 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:19.775 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.028) 0:01:19.804 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.028) 0:01:19.833 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.031) 0:01:19.864 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.030) 0:01:19.895 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.032) 0:01:19.927 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.030) 0:01:19.957 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.030) 0:01:19.987 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.029) 0:01:20.017 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.081) 0:01:20.098 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:45:13 +0000 (0:00:00.036) 0:01:20.135 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.127) 0:01:20.262 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/vg1-lv1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.038) 0:01:20.301 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.094) 0:01:20.396 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.033) 0:01:20.429 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.048) 0:01:20.477 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.033) 0:01:20.511 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.032) 0:01:20.543 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.032) 0:01:20.575 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.033) 0:01:20.609 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.033) 0:01:20.642 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.047) 0:01:20.690 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.026) 0:01:20.716 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.034) 0:01:20.750 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.029) 0:01:20.780 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.032) 0:01:20.813 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.029) 0:01:20.842 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:45:14 +0000 (0:00:00.025) 0:01:20.867 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.372) 0:01:21.239 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.038) 0:01:21.278 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.026) 0:01:21.304 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.337 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:21.369 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.026) 0:01:21.395 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.029) 0:01:21.424 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.458 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:21.489 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.026) 0:01:21.515 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.032) 0:01:21.548 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.030) 0:01:21.578 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.030) 0:01:21.608 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.029) 0:01:21.638 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.029) 0:01:21.667 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.036) 0:01:21.704 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.037) 0:01:21.741 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.775 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.808 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.029) 0:01:21.838 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.871 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.030) 0:01:21.902 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.033) 0:01:21.935 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:21.966 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:21.998 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.040) 0:01:22.038 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:22.070 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.032) 0:01:22.102 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.034) 0:01:22.137 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:45:15 +0000 (0:00:00.031) 0:01:22.168 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.030) 0:01:22.199 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.035) 0:01:22.234 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.031) 0:01:22.265 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.029) 0:01:22.295 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.033) 0:01:22.328 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.029) 0:01:22.358 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.028) 0:01:22.387 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.034) 0:01:22.421 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.032) 0:01:22.454 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.028) 0:01:22.482 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.033) 0:01:22.516 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.035) 0:01:22.552 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.034) 0:01:22.586 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.030) 0:01:22.617 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.029) 0:01:22.646 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.032) 0:01:22.678 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01
June 2022 16:45:16 +0000 (0:00:00.032) 0:01:22.711 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.087) 0:01:22.798 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.039) 0:01:22.837 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.124) 0:01:22.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 
2022 16:45:16 +0000 (0:00:00.036) 0:01:22.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.039) 0:01:23.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.029) 0:01:23.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.036) 0:01:23.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.030) 0:01:23.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:45:16 +0000 (0:00:00.030) 0:01:23.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.030) 0:01:23.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.032) 0:01:23.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.030) 0:01:23.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.048) 0:01:23.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.026) 0:01:23.333 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] 
**************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.035) 0:01:23.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:23.399 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.031) 0:01:23.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:23.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:23.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 
16:45:17 +0000 (0:00:00.380) 0:01:23.870 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.038) 0:01:23.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.026) 0:01:23.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.033) 0:01:23.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:23.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.024) 0:01:24.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.034) 0:01:24.058 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.036) 0:01:24.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:24.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.026) 0:01:24.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:45:17 +0000 (0:00:00.029) 0:01:24.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.032) 0:01:24.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.028) 0:01:24.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.029) 0:01:24.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.040) 0:01:24.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.041) 0:01:24.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.031) 0:01:24.445 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.032) 0:01:24.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.031) 0:01:24.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.029) 0:01:24.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.029) 0:01:24.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:45:18 +0000 
(0:00:00.030) 0:01:24.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.032) 0:01:24.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.029) 0:01:24.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.037) 0:01:24.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.041) 0:01:24.832 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.042) 0:01:24.874 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.031) 0:01:24.906 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.029) 0:01:24.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.032) 0:01:24.967 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.030) 0:01:24.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.031) 0:01:25.029 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 
16:45:18 +0000 (0:00:00.091) 0:01:25.120 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:45:18 +0000 (0:00:00.035) 0:01:25.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.031) 0:01:25.306 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.030) 0:01:25.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.028) 0:01:25.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.034) 0:01:25.459 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.125) 0:01:25.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.037) 0:01:25.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.039) 0:01:25.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.691 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.034) 0:01:25.726 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.028) 0:01:25.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.031) 0:01:25.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:25.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.030) 0:01:25.876 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", 
"storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.045) 0:01:25.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.024) 0:01:25.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.034) 0:01:25.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.031) 0:01:26.011 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.029) 0:01:26.041 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.028) 0:01:26.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:45:19 +0000 (0:00:00.024) 0:01:26.093 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.374) 0:01:26.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.037) 0:01:26.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.026) 0:01:26.532 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:26.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.031) 0:01:26.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.025) 0:01:26.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:26.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:26.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:26.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.025) 0:01:26.732 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] 
*********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.033) 0:01:26.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:26.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.031) 0:01:26.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:26.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:26.887 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.037) 0:01:26.924 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.038) 0:01:26.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:26.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.029) 0:01:27.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.028) 0:01:27.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:27.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:27.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.034) 0:01:27.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:45:20 +0000 (0:00:00.030) 0:01:27.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.030) 0:01:27.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:27.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.036) 0:01:27.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.032) 0:01:27.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the 
requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.082) 0:01:27.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.032) 0:01:27.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.030) 0:01:27.451 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.036) 0:01:27.487 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.028) 0:01:27.516 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.028) 0:01:27.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.030) 0:01:27.575 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.028) 0:01:27.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.029) 0:01:27.633 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.036) 0:01:27.669 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.033) 0:01:27.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.029) 0:01:27.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.032) 0:01:27.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.030) 0:01:27.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:27.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.030) 0:01:27.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:27.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:27.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.033) 0:01:27.954 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:27.986 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.033) 0:01:28.019 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.028) 0:01:28.048 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create a RAID1 lvm raid device] ****************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:100 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.031) 0:01:28.079 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:45:21 +0000 (0:00:00.083) 0:01:28.163 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.043) 0:01:28.207 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.516) 0:01:28.723 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.080) 0:01:28.804 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.029) 0:01:28.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.031) 0:01:28.865 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.062) 0:01:28.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.026) 0:01:28.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.032) 0:01:28.986 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.040) 0:01:29.026 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.044) 0:01:29.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.034) 0:01:29.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.031) 0:01:29.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:45:22 +0000 (0:00:00.030) 0:01:29.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:45:23 +0000 (0:00:00.029) 0:01:29.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:45:23 +0000 (0:00:00.044) 0:01:29.241 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:45:23 +0000 (0:00:00.028) 0:01:29.270 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", 
"fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:45:26 +0000 (0:00:02.960) 0:01:32.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.031) 0:01:32.262 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.028) 0:01:32.290 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", 
"/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.041) 0:01:32.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": 
[ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.039) 0:01:32.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.033) 0:01:32.405 ******** TASK [linux-system-roles.storage : tell systemd to 
refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.028) 0:01:32.433 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:45:26 +0000 (0:00:00.694) 0:01:33.127 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:45:27 +0000 (0:00:00.438) 0:01:33.566 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:45:28 +0000 (0:00:00.834) 0:01:34.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:45:28 +0000 (0:00:00.375) 0:01:34.776 ******** TASK 
[linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:45:28 +0000 (0:00:00.030) 0:01:34.806 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:116
Wednesday 01 June 2022 16:45:29 +0000 (0:00:00.921) 0:01:35.727 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:45:29 +0000 (0:00:00.081) 0:01:35.809 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:45:29 +0000 (0:00:00.044) 0:01:35.854 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:45:29 +0000 (0:00:00.032) 0:01:35.886 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_0", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_1", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "CjZKig-XXjg-vbHY-iQkb-ISMr-fcF0-fG4BpY" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "IL9YJH-Ryd1-TjQH-mgnE-hLh5-ebs9-SUbkyq" },
"/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:45:30 +0000 (0:00:00.407) 0:01:36.294 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002595", "end": "2022-06-01 12:45:29.893972", "rc": 0, "start": "2022-06-01 12:45:29.891377" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:45:30 +0000 (0:00:00.374) 0:01:36.669 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002892", "end": "2022-06-01 12:45:30.284224", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:45:30.281332" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:45:30 +0000 (0:00:00.391) 0:01:37.060 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:45:30 +0000 (0:00:00.066) 0:01:37.127 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.076) 0:01:37.203 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.068) 0:01:37.272 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [],
"_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.039) 0:01:37.312 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.749) 0:01:38.061 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" }
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.054) 0:01:38.116 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:45:31 +0000 (0:00:00.041) 0:01:38.157 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.038) 0:01:38.196 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.038) 0:01:38.234 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.031) 0:01:38.265 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" }

MSG: All assertions passed
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" }

MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.054) 0:01:38.320 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.059) 0:01:38.380 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.033) 0:01:38.414 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.033) 0:01:38.447 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.034) 0:01:38.482 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.032) 0:01:38.514 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.032) 0:01:38.546 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.033) 0:01:38.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.032) 0:01:38.612 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.032) 0:01:38.645 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.065) 0:01:38.711 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:45:32 +0000 (0:00:00.075) 0:01:38.787 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid1", "vg1" ], "delta": "0:00:00.038858", "end": "2022-06-01 12:45:32.454158", "rc": 0, "start": "2022-06-01 12:45:32.415300" }

STDOUT:

lv1

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.451) 0:01:39.238 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.049) 0:01:39.287 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.044) 0:01:39.331 ********
included:
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.069) 0:01:39.401 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.041) 0:01:39.442 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.044) 0:01:39.486 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.071) 0:01:39.558 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.037) 0:01:39.596 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.037) 0:01:39.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.031) 0:01:39.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.031) 0:01:39.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.030) 0:01:39.727 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.034) 0:01:39.762 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.040) 0:01:39.803 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.083) 0:01:39.887 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.032) 0:01:39.919 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.031) 0:01:39.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.029) 0:01:39.980 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.030) 0:01:40.010 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.033) 0:01:40.044 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.069) 0:01:40.113 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:45:33 +0000 (0:00:00.069) 0:01:40.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.033) 0:01:40.216 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.032) 0:01:40.248 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.031) 0:01:40.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.031) 0:01:40.312 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.032) 0:01:40.345 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.032) 0:01:40.377 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.035) 0:01:40.412 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.033) 0:01:40.446 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.033) 0:01:40.479 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.063) 0:01:40.543 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.040) 0:01:40.583 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.127) 0:01:40.711 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.038) 0:01:40.749 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 508813, "block_size": 4096, "block_total": 520704, "block_used": 11891, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2084098048, "size_total": 2132803584, "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 508813, "block_size": 4096, "block_total": 520704, "block_used": 11891, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1046525, "inode_total": 1046528, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2084098048, "size_total": 2132803584, "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.043) 0:01:40.792 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.039) 0:01:40.832 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.039) 0:01:40.871 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.041) 0:01:40.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.032) 0:01:40.945 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.033) 0:01:40.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.031) 0:01:41.010 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.035) 0:01:41.046 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ],
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.054) 0:01:41.100 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.038) 0:01:41.138 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:45:34 +0000 (0:00:00.038) 0:01:41.177 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.038) 0:01:41.215 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.041) 0:01:41.256 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.047) 0:01:41.304 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.041) 0:01:41.346 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101925.3811214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101925.3811214, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7708, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101925.3811214, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.407) 0:01:41.753 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.040) 0:01:41.794 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.038) 0:01:41.832 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.037) 0:01:41.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.033) 0:01:41.903 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.037) 0:01:41.941 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.031) 0:01:41.973 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.032) 0:01:42.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June
2022 16:45:35 +0000 (0:00:00.031) 0:01:42.037 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.040) 0:01:42.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:45:35 +0000 (0:00:00.034) 0:01:42.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.089) 0:01:42.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.034) 0:01:42.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.302 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.038) 0:01:42.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.036) 0:01:42.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.034) 0:01:42.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.031) 0:01:42.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.031) 0:01:42.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.031) 0:01:42.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.033) 0:01:42.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.031) 0:01:42.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.032) 0:01:42.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.033) 0:01:42.736 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:45:36 +0000 (0:00:00.388) 0:01:43.124 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.364) 0:01:43.489 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.040) 0:01:43.530 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.036) 0:01:43.566 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.033) 0:01:43.600 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.034) 0:01:43.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.033) 0:01:43.667 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.032) 0:01:43.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.032) 0:01:43.732 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.038) 0:01:43.771 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.038) 0:01:43.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:45:37 +0000 (0:00:00.042) 0:01:43.852 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.039104", "end": "2022-06-01 12:45:37.502876", "rc": 0, "start": "2022-06-01 12:45:37.463772" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.430) 0:01:44.282 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid1" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.041) 0:01:44.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.042) 0:01:44.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.036) 0:01:44.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.037) 0:01:44.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.035) 0:01:44.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.034) 0:01:44.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.032) 0:01:44.543 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.031) 0:01:44.575 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.031) 0:01:44.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:118 Wednesday 01 June 2022 16:45:38 
+0000 (0:00:00.031) 0:01:44.638 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.087) 0:01:44.726 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:45:38 +0000 (0:00:00.046) 0:01:44.772 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.518) 0:01:45.290 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.079) 0:01:45.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.034) 0:01:45.404 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.031) 0:01:45.435 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.064) 0:01:45.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.028) 0:01:45.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.032) 0:01:45.560 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.040) 0:01:45.601 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.033) 0:01:45.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.032) 0:01:45.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.031) 0:01:45.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.032) 0:01:45.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.031) 0:01:45.762 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.054) 0:01:45.817 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:45:39 +0000 (0:00:00.034) 0:01:45.851 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:45:42 +0000 (0:00:02.587) 0:01:48.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.034) 0:01:48.473 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.031) 0:01:48.504 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", 
"/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.048) 0:01:48.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.040) 0:01:48.593 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.037) 0:01:48.631 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:45:42 +0000 (0:00:00.031) 0:01:48.663 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:45:43 +0000 (0:00:00.707) 0:01:49.370 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:45:43 +0000 (0:00:00.401) 0:01:49.772 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:45:44 +0000 (0:00:00.666) 0:01:50.438 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:45:44 +0000 (0:00:00.374) 0:01:50.812 ******** TASK [linux-system-roles.storage : 
Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:45:44 +0000 (0:00:00.033) 0:01:50.845 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:134 Wednesday 01 June 2022 16:45:45 +0000 (0:00:00.906) 0:01:51.752 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:45:45 +0000 (0:00:00.118) 0:01:51.870 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", 
"sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:45:45 +0000 (0:00:00.043) 0:01:51.914 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:45:45 +0000 (0:00:00.031) 0:01:51.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "2G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_0", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rmeta_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rmeta_1", "size": "4M", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "CjZKig-XXjg-vbHY-iQkb-ISMr-fcF0-fG4BpY" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "IL9YJH-Ryd1-TjQH-mgnE-hLh5-ebs9-SUbkyq" }, "/dev/sdc": { "fstype": "", 
"label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:45:46 +0000 (0:00:00.412) 0:01:52.358 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002563", "end": "2022-06-01 12:45:45.962448", "rc": 0, "start": "2022-06-01 12:45:45.959885" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:45:46 +0000 (0:00:00.378) 0:01:52.737 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002768", "end": "2022-06-01 12:45:46.352277", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:45:46.349509" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:45:46 +0000 (0:00:00.390) 0:01:53.127 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.070) 0:01:53.198 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.034) 0:01:53.232 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.066) 0:01:53.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.043) 0:01:53.342 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.734) 0:01:54.076 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.053) 0:01:54.130 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:45:47 +0000 (0:00:00.041) 0:01:54.171 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.037) 0:01:54.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.037) 0:01:54.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.277 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.054) 0:01:54.332 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.060) 0:01:54.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.033) 0:01:54.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.032) 0:01:54.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.031) 0:01:54.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.032) 0:01:54.647 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.063) 0:01:54.711 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.064) 0:01:54.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid1", "vg1" ], "delta": "0:00:00.031698", "end": "2022-06-01 12:45:48.408469", "rc": 0, "start": "2022-06-01 12:45:48.376771" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:45:48 +0000 (0:00:00.405) 0:01:55.181 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.045) 0:01:55.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.039) 0:01:55.267 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.061) 0:01:55.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.035) 0:01:55.364 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.043) 0:01:55.407 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.076) 0:01:55.483 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.036) 0:01:55.520 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.037) 0:01:55.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.033) 0:01:55.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.031) 0:01:55.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.030) 0:01:55.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.031) 0:01:55.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 
Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.036) 0:01:55.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.036) 0:01:55.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.034) 0:01:55.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.031) 0:01:55.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.031) 0:01:55.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.032) 0:01:55.888 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 
June 2022 16:45:49 +0000 (0:00:00.031) 0:01:55.920 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.064) 0:01:55.984 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.068) 0:01:56.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.032) 0:01:56.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.034) 0:01:56.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:45:49 +0000 (0:00:00.033) 0:01:56.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.031) 0:01:56.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.032) 0:01:56.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.036) 0:01:56.321 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.355 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
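
The loop-variable warning above has a standard remedy: give the inner loop its own variable name via `loop_control` so it cannot shadow a variable already in use by an enclosing loop. A minimal sketch of that pattern — the task name, file name, and variable names here are illustrative, not taken from this test suite:

```yaml
# Hypothetical task showing the fix the warning suggests: rename the
# inner loop variable with loop_control.loop_var so it does not collide
# with 'storage_test_volume' used by an outer loop.
- name: Verify each volume (illustrative)
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes | default([]) }}"
  loop_control:
    loop_var: storage_test_vol_inner  # distinct name avoids the collision
```

With a distinct `loop_var`, the included tasks reference `storage_test_vol_inner` instead of the default `item` (or the colliding name), and the "already in use" warning goes away.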
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.064) 0:01:56.419 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.037) 0:01:56.456 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.129) 0:01:56.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.036) 0:01:56.623 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509814, "block_size": 4096, "block_total": 521728, "block_used": 11914, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088198144, "size_total": 2136997888, "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509814, "block_size": 4096, "block_total": 521728, "block_used": 11914, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088198144, "size_total": 2136997888, "uuid": "6cf3967d-8ed2-4949-ae0f-d16d4c9e0d71" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.043) 0:01:56.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.052) 0:01:56.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.041) 0:01:56.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.042) 0:01:56.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.032) 0:01:56.903 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:56.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.049) 0:01:56.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.098) 0:01:57.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.041) 0:01:57.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:45:50 +0000 (0:00:00.033) 0:01:57.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.034) 0:01:57.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.039) 0:01:57.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.041) 0:01:57.275 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101941.6061215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101941.6061215, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 7708, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101941.6061215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.382) 0:01:57.657 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.040) 0:01:57.698 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.039) 0:01:57.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.035) 0:01:57.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.034) 0:01:57.808 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.036) 0:01:57.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.037) 0:01:57.881 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.034) 0:01:57.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:45:51 +0000 (0:00:00.031) 0:01:57.947 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.039) 0:01:57.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.031) 0:01:58.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.032) 0:01:58.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.036) 0:01:58.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.033) 0:01:58.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:45:51 +0000 (0:00:00.033) 0:01:58.154 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.041) 0:01:58.196 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.037) 0:01:58.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.032) 0:01:58.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.034) 0:01:58.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.032) 0:01:58.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.032) 0:01:58.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.033) 0:01:58.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.031) 0:01:58.587 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:45:52 +0000 (0:00:00.396) 0:01:58.983 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.373) 0:01:59.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.041) 0:01:59.397 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.035) 0:01:59.433 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.033) 0:01:59.466 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.032) 0:01:59.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.031) 0:01:59.531 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.031) 0:01:59.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.031) 0:01:59.593 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.034) 0:01:59.628 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.035) 0:01:59.664 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.039) 0:01:59.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.037772", "end": "2022-06-01 12:45:53.347711", "rc": 0, "start": "2022-06-01 12:45:53.309939" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:45:53 +0000 (0:00:00.459) 0:02:00.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid1" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.042) 0:02:00.206 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.043) 0:02:00.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.034) 0:02:00.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.040) 0:02:00.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.035) 0:02:00.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.035) 0:02:00.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.032) 0:02:00.429 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.044) 0:02:00.473 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.033) 0:02:00.507 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the device created above] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:136 Wednesday 01 June 2022 16:45:54 
+0000 (0:00:00.039) 0:02:00.547 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.106) 0:02:00.653 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:45:54 +0000 (0:00:00.047) 0:02:00.701 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.505) 0:02:01.207 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.077) 0:02:01.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.032) 0:02:01.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.033) 0:02:01.351 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.066) 0:02:01.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.031) 0:02:01.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.035) 0:02:01.484 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.042) 0:02:01.526 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.036) 0:02:01.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.032) 0:02:01.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.035) 0:02:01.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.033) 0:02:01.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.033) 0:02:01.697 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.048) 0:02:01.746 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:45:55 +0000 (0:00:00.030) 0:02:01.776 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:45:58 +0000 (0:00:03.002) 0:02:04.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:45:58 +0000 (0:00:00.033) 0:02:04.813 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:45:58 +0000 (0:00:00.030) 
0:02:04.843 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:45:58 +0000 (0:00:00.044) 0:02:04.888 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": 
"lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:45:58 +0000 (0:00:00.039) 0:02:04.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:45:58 +0000 (0:00:00.035) 0:02:04.963 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:45:59 +0000 (0:00:00.392) 0:02:05.355 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:45:59 +0000 (0:00:00.662) 0:02:06.018 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:45:59 +0000 (0:00:00.033) 0:02:06.051 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:46:00 +0000 (0:00:00.667) 0:02:06.719 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:46:00 +0000 (0:00:00.361) 0:02:07.080 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:46:00 +0000 (0:00:00.031) 0:02:07.112 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:152 Wednesday 01 June 2022 16:46:01 +0000 (0:00:00.844) 0:02:07.956 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:46:01 +0000 (0:00:00.072) 0:02:08.029 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid1", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:46:01 +0000 (0:00:00.039) 0:02:08.068 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:46:01 +0000 (0:00:00.044) 0:02:08.113 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:46:02 +0000 (0:00:00.391) 0:02:08.504 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003490", 
"end": "2022-06-01 12:46:02.123164", "rc": 0, "start": "2022-06-01 12:46:02.119674" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:46:02 +0000 (0:00:00.400) 0:02:08.904 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002832", "end": "2022-06-01 12:46:02.515981", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:46:02.513149" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.381) 0:02:09.286 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.062) 0:02:09.348 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.030) 0:02:09.379 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.062) 0:02:09.441 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.039) 0:02:09.480 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.029) 0:02:09.510 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.028) 0:02:09.538 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.039) 0:02:09.578 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.036) 0:02:09.614 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.035) 0:02:09.650 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.028) 0:02:09.678 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.029) 0:02:09.707 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.056) 0:02:09.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.031) 0:02:09.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.031) 0:02:09.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.033) 0:02:09.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.030) 0:02:09.890 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.032) 0:02:09.922 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.029) 0:02:09.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.028) 0:02:09.980 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null,
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.028) 0:02:10.009 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.057) 0:02:10.066 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.057) 0:02:10.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:46:03 +0000 (0:00:00.030) 0:02:10.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.035) 0:02:10.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.033) 0:02:10.224 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.061) 0:02:10.286 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.035) 0:02:10.321 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.027) 0:02:10.349 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.029) 0:02:10.378 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.030) 0:02:10.409 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.059) 0:02:10.469 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.071) 0:02:10.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.031) 0:02:10.571 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.030) 0:02:10.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.659 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.030) 0:02:10.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.746 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.028) 0:02:10.803 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.057) 0:02:10.860 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.033) 0:02:10.893 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.167) 0:02:11.061 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.034) 0:02:11.095 ********
ok: [/cache/rhel-x.qcow2] => {
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.036) 0:02:11.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:46:04 +0000 (0:00:00.031) 0:02:11.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.036) 0:02:11.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.032) 0:02:11.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.027) 0:02:11.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.027) 
0:02:11.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.027) 0:02:11.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.028) 0:02:11.344 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.046) 0:02:11.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.031) 0:02:11.423 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.043) 0:02:11.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.031) 0:02:11.498 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.029) 0:02:11.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.028) 0:02:11.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.027) 0:02:11.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.371) 0:02:11.955 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.036) 0:02:11.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.027) 0:02:12.018 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.034) 0:02:12.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.030) 0:02:12.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.026) 0:02:12.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.029) 0:02:12.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:46:05 +0000 (0:00:00.033) 0:02:12.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.032) 0:02:12.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.026) 0:02:12.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.029) 0:02:12.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.077) 0:02:12.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.075) 0:02:12.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.036) 0:02:12.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.033) 0:02:12.484 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.039) 0:02:12.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.035) 0:02:12.560 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.030) 0:02:12.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.029) 0:02:12.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:12.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.035) 0:02:12.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.032) 0:02:12.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.032) 0:02:12.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:12.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.035) 0:02:12.819 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.052) 0:02:12.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.035) 0:02:12.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.030) 0:02:12.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:12.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:13.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:13.033 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.034) 0:02:13.067 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.031) 0:02:13.098 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.028) 0:02:13.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.028) 0:02:13.156 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:46:06 +0000 (0:00:00.027) 0:02:13.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.027) 0:02:13.211 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.033) 0:02:13.245 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.036) 0:02:13.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.029) 0:02:13.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.028) 0:02:13.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.030) 0:02:13.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.027) 0:02:13.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.030) 0:02:13.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.031) 0:02:13.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.027) 0:02:13.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.030) 0:02:13.517 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.028) 0:02:13.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.030) 0:02:13.577 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.026) 0:02:13.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create a RAID0 lvm raid device] ****************************************** task 
path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:154 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.075) 0:02:13.679 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.104) 0:02:13.784 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:46:07 +0000 (0:00:00.046) 0:02:13.830 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.515) 0:02:14.346 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage 
: define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.070) 0:02:14.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.031) 0:02:14.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.028) 0:02:14.477 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.061) 0:02:14.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.025) 0:02:14.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.030) 0:02:14.594 ******** ok: [/cache/rhel-x.qcow2] => 
{ "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.038) 0:02:14.632 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.031) 0:02:14.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.032) 0:02:14.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.031) 0:02:14.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.029) 0:02:14.758 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.034) 0:02:14.792 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.056) 0:02:14.848 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:46:08 +0000 (0:00:00.033) 0:02:14.882 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", 
"sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:46:11 +0000 (0:00:02.832) 0:02:17.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.033) 0:02:17.747 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.031) 0:02:17.779 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.043) 0:02:17.823 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.037) 0:02:17.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.038) 0:02:17.899 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:46:11 +0000 (0:00:00.030) 0:02:17.929 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:46:12 +0000 (0:00:00.644) 0:02:18.573 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", 
"state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:46:12 +0000 (0:00:00.429) 0:02:19.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:46:13 +0000 (0:00:00.698) 0:02:19.701 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:46:13 +0000 (0:00:00.350) 0:02:20.052 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:46:13 +0000 (0:00:00.029) 0:02:20.082 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:170 Wednesday 01 June 2022 16:46:14 +0000 (0:00:00.847) 0:02:20.929 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:46:14 +0000 (0:00:00.080) 0:02:21.009 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:46:14 +0000 (0:00:00.042) 0:02:21.052 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:46:14 +0000 (0:00:00.031) 0:02:21.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "M9fSs3-Ape2-MeNV-tnMC-EsDz-YoEy-afMlgf" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "t3Q508-gIvA-ni3E-PLP3-xKLW-qvfi-mNKkqC" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": 
"xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:46:15 +0000 (0:00:00.393) 0:02:21.476 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002627", "end": "2022-06-01 12:46:15.080686", "rc": 0, "start": "2022-06-01 12:46:15.078059" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:46:15 +0000 (0:00:00.378) 0:02:21.855 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003218", "end": "2022-06-01 12:46:15.460600", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:46:15.457382" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:46:16 +0000 (0:00:00.381) 0:02:22.237 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:46:16 +0000 (0:00:00.066) 0:02:22.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:46:16 +0000 (0:00:00.033) 0:02:22.337 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:46:16 +0000 (0:00:00.073) 0:02:22.410 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:46:16 +0000 (0:00:00.044) 0:02:22.455 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.731) 0:02:23.186 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.052) 0:02:23.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.040) 0:02:23.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.038) 0:02:23.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.037) 0:02:23.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.029) 0:02:23.384 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.053) 0:02:23.438 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.117) 0:02:23.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.034) 0:02:23.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.032) 0:02:23.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.031) 0:02:23.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.030) 0:02:23.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.031) 0:02:23.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.034) 0:02:23.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.034) 0:02:23.786 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.034) 0:02:23.820 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.061) 0:02:23.882 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:46:17 +0000 (0:00:00.068) 0:02:23.951 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid0", "vg1" ], "delta": "0:00:00.038542", "end": "2022-06-01 12:46:17.610477", "rc": 0, "start": "2022-06-01 12:46:17.571935" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.435) 0:02:24.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.044) 0:02:24.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.042) 0:02:24.473 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.066) 0:02:24.540 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.036) 
0:02:24.577 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.041) 0:02:24.618 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.070) 0:02:24.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.035) 0:02:24.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.036) 0:02:24.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 
2022 16:46:18 +0000 (0:00:00.030) 0:02:24.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.030) 0:02:24.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.031) 0:02:24.854 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.035) 0:02:24.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.036) 0:02:24.925 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.038) 0:02:24.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.031) 0:02:24.995 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.030) 0:02:25.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.030) 0:02:25.056 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.034) 0:02:25.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:46:18 +0000 (0:00:00.032) 0:02:25.124 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.063) 0:02:25.187 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.069) 0:02:25.257 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.032) 0:02:25.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.033) 0:02:25.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.037) 0:02:25.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.041) 0:02:25.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.039) 0:02:25.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.030) 0:02:25.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.030) 0:02:25.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.032) 0:02:25.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.030) 0:02:25.565 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.075) 0:02:25.640 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.038) 0:02:25.678 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.193) 0:02:25.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.046) 0:02:25.918 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.044) 0:02:25.963 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.043) 0:02:26.006 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.036) 0:02:26.042 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.037) 0:02:26.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.031) 0:02:26.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.032) 0:02:26.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:46:19 +0000 (0:00:00.031) 0:02:26.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.037) 0:02:26.213 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.049) 0:02:26.263 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.036) 0:02:26.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.038) 0:02:26.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.031) 0:02:26.369 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.035) 0:02:26.404 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.039) 0:02:26.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.038) 0:02:26.482 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101970.8731215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101970.8731215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8179, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101970.8731215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.402) 0:02:26.884 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.038) 0:02:26.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.038) 0:02:26.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.035) 0:02:26.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.034) 0:02:27.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.036) 0:02:27.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.030) 0:02:27.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:46:20 +0000 (0:00:00.030) 0:02:27.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:46:20 +0000 (0:00:00.031) 0:02:27.162 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.038) 0:02:27.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.035) 0:02:27.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.032) 0:02:27.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.032) 0:02:27.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.031) 0:02:27.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.032) 0:02:27.365 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.039) 0:02:27.404 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.038) 0:02:27.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.031) 0:02:27.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.031) 0:02:27.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.031) 0:02:27.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.038) 0:02:27.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.036) 0:02:27.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.038) 0:02:27.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.033) 0:02:27.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.033) 0:02:27.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.034) 0:02:27.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.032) 0:02:27.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:46:21 +0000 (0:00:00.033) 0:02:27.817 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.387) 0:02:28.205 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.377) 0:02:28.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.041) 0:02:28.624 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.091) 0:02:28.716 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.038) 0:02:28.754 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.033) 0:02:28.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.033) 0:02:28.822 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.035) 0:02:28.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.032) 0:02:28.889 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.041) 0:02:28.930 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.035) 0:02:28.966 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:46:22 +0000 (0:00:00.040) 0:02:29.007 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.034618", "end": "2022-06-01 12:46:22.661641", "rc": 0, "start": "2022-06-01 12:46:22.627023" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid0 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.441) 0:02:29.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid0" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.042) 0:02:29.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.041) 0:02:29.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.036) 0:02:29.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.033) 0:02:29.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.032) 0:02:29.636 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.032) 0:02:29.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.031) 0:02:29.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.032) 0:02:29.732 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.032) 0:02:29.765 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:172 Wednesday 01 June 2022 16:46:23 
+0000 (0:00:00.032) 0:02:29.797 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.112) 0:02:29.910 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:46:23 +0000 (0:00:00.046) 0:02:29.957 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.533) 0:02:30.491 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.081) 0:02:30.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.035) 0:02:30.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.033) 0:02:30.641 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.067) 0:02:30.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.027) 0:02:30.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.032) 0:02:30.769 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", 
"state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.038) 0:02:30.808 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.033) 0:02:30.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.034) 0:02:30.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.034) 0:02:30.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.032) 0:02:30.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.031) 0:02:30.974 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.048) 0:02:31.022 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:46:24 +0000 (0:00:00.029) 0:02:31.051 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:46:26 +0000 (0:00:01.814) 0:02:32.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.034) 0:02:32.901 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.031) 0:02:32.933 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.044) 0:02:32.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.039) 0:02:33.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.037) 0:02:33.054 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:46:26 +0000 (0:00:00.030) 0:02:33.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current 
mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:46:27 +0000 (0:00:00.689) 0:02:33.774 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:46:28 +0000 (0:00:00.418) 0:02:34.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:46:28 +0000 (0:00:00.646) 0:02:34.840 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:46:29 +0000 (0:00:00.373) 0:02:35.213 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:46:29 +0000 (0:00:00.032) 0:02:35.246 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** 
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:188 Wednesday 01 June 2022 16:46:29 +0000 (0:00:00.857) 0:02:36.104 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:46:30 +0000 (0:00:00.086) 0:02:36.190 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 
Wednesday 01 June 2022 16:46:30 +0000 (0:00:00.042) 0:02:36.233 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:46:30 +0000 (0:00:00.031) 0:02:36.264 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" }, "/dev/mapper/vg1-lv1_rimage_0": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_0", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-lv1_rimage_1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-lv1_rimage_1", "size": "1G", "type": "lvm", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "M9fSs3-Ape2-MeNV-tnMC-EsDz-YoEy-afMlgf" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "t3Q508-gIvA-ni3E-PLP3-xKLW-qvfi-mNKkqC" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", 
"label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:46:30 +0000 (0:00:00.373) 0:02:36.637 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002520", "end": "2022-06-01 12:46:30.237262", "rc": 0, "start": "2022-06-01 12:46:30.234742" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:46:30 +0000 (0:00:00.373) 0:02:37.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003049", "end": "2022-06-01 12:46:30.608850", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:46:30.605801" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:46:31 +0000 (0:00:00.374) 
0:02:37.385 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:46:31 +0000 (0:00:00.137) 0:02:37.523 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:46:31 +0000 (0:00:00.033) 0:02:37.556 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:46:31 +0000 (0:00:00.065) 0:02:37.621 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb1", "/dev/sda1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:46:31 +0000 (0:00:00.044) 0:02:37.666 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.765) 0:02:38.432 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb1" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb1", "/dev/sda1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda1" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.054) 0:02:38.487 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.041) 0:02:38.529 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.158) 0:02:38.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.045) 0:02:38.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] 
*********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.038) 0:02:38.771 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.070) 0:02:38.842 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.064) 0:02:38.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.031) 0:02:38.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.029) 0:02:38.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.029) 0:02:38.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.035) 0:02:39.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.034) 0:02:39.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.041) 0:02:39.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.036) 0:02:39.145 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:46:32 +0000 (0:00:00.032) 0:02:39.178 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.076) 0:02:39.255 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get 
information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.068) 0:02:39.323 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "lv_name", "--select", "lv_name=lv1&&lv_layout=raid0", "vg1" ], "delta": "0:00:00.029994", "end": "2022-06-01 12:46:33.081597", "rc": 0, "start": "2022-06-01 12:46:33.051603" } STDOUT: lv1 TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.544) 0:02:39.868 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.051) 0:02:39.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.045) 0:02:39.965 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.066) 0:02:40.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 
16:46:33 +0000 (0:00:00.037) 0:02:40.069 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:46:33 +0000 (0:00:00.042) 0:02:40.112 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.072) 0:02:40.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.039) 0:02:40.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.036) 0:02:40.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.030) 0:02:40.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.030) 0:02:40.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.031) 0:02:40.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.031) 0:02:40.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.095) 0:02:40.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.039) 0:02:40.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 
Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.033) 0:02:40.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.033) 0:02:40.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.032) 0:02:40.619 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.035) 0:02:40.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.032) 0:02:40.688 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.064) 0:02:40.752 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:46:34 
+0000 (0:00:00.069) 0:02:40.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.033) 0:02:40.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.031) 0:02:40.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.032) 0:02:40.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.032) 0:02:40.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.034) 0:02:40.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.031) 0:02:41.017 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.031) 0:02:41.049 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.033) 0:02:41.083 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.033) 0:02:41.116 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:46:34 +0000 (0:00:00.067) 0:02:41.184 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.054) 0:02:41.238 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.129) 0:02:41.368 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.040) 0:02:41.408 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509649, "block_size": 4096, "block_total": 521600, "block_used": 11951, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048317, "inode_total": 1048320, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=256,noquota", "size_available": 2087522304, "size_total": 2136473600, "uuid": "1a804ed0-b34e-4e3d-b168-a0dcc8fe1249" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.052) 0:02:41.461 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.041) 0:02:41.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.039) 0:02:41.541 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.039) 0:02:41.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.032) 0:02:41.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.033) 0:02:41.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.035) 0:02:41.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.033) 0:02:41.717 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.048) 0:02:41.765 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.038) 0:02:41.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.039) 0:02:41.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.032) 0:02:41.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.033) 0:02:41.909 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.038) 0:02:41.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:46:35 +0000 (0:00:00.038) 0:02:41.986 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654101970.8731215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654101970.8731215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8179, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654101970.8731215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.402) 0:02:42.389 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.039) 0:02:42.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.038) 0:02:42.467 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.035) 0:02:42.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.033) 0:02:42.536 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.036) 0:02:42.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:42.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:42.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:46:36 +0000 (0:00:00.032) 0:02:42.669 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.038) 0:02:42.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.096) 0:02:42.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:42.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.033) 0:02:42.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:42.902 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:42.934 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.042) 0:02:42.977 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.043) 0:02:43.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.030) 0:02:43.051 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:43.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:43.116 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.033) 0:02:43.150 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:46:36 +0000 (0:00:00.032) 0:02:43.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.034) 0:02:43.216 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.033) 0:02:43.250 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.032) 0:02:43.282 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.037) 0:02:43.319 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.038) 0:02:43.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.035) 0:02:43.393 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.374) 0:02:43.768 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:46:37 +0000 (0:00:00.396) 0:02:44.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.044) 0:02:44.208 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.038) 0:02:44.246 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.034) 0:02:44.281 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.032) 0:02:44.314 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.033) 0:02:44.347 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.033) 0:02:44.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.036) 0:02:44.418 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.037) 0:02:44.456 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.034) 0:02:44.490 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.041) 0:02:44.531 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.036316", "end": "2022-06-01 12:46:38.168597", "rc": 0, "start": "2022-06-01 12:46:38.132281" }

STDOUT:

LVM2_LV_NAME=lv1 LVM2_LV_ATTR=rwi-aor--- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=raid0

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.412) 0:02:44.944 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "raid0" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.041) 0:02:44.985 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.043) 0:02:45.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.047) 0:02:45.076 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.035) 0:02:45.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.034) 0:02:45.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:46:38 +0000 (0:00:00.034) 0:02:45.181 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.032) 0:02:45.214 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.033) 0:02:45.248 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.033) 0:02:45.281 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove the device created above] *****************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:190
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.032) 0:02:45.314 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.125) 0:02:45.440 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.047) 0:02:45.487 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.512) 0:02:45.999 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.074) 0:02:46.074 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.034) 0:02:46.108 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:46:39 +0000 (0:00:00.037) 0:02:46.146 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.073) 0:02:46.219 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.029) 0:02:46.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.031) 0:02:46.280 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "size": "2g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.031) 0:02:46.318 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.031) 0:02:46.350 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.029) 0:02:46.379 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.031) 0:02:46.411 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.030) 0:02:46.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.033) 0:02:46.475 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.046) 0:02:46.522 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:46:40 +0000 (0:00:00.030) 0:02:46.552 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:46:43 +0000 (0:00:02.843) 0:02:49.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.035) 0:02:49.432 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.031) 0:02:49.463 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.043) 0:02:49.507 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.041) 0:02:49.548 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.037) 0:02:49.585 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:46:43 +0000 (0:00:00.384) 0:02:49.969 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:46:44 +0000 (0:00:00.670) 0:02:50.639 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:46:44 +0000 (0:00:00.034) 0:02:50.674 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:46:45 +0000 (0:00:00.652) 0:02:51.326 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:46:45 +0000 (0:00:00.403) 0:02:51.730 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:46:45 +0000 (0:00:00.032) 0:02:51.762 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml:206
Wednesday 01 June 2022 16:46:46 +0000 (0:00:00.835) 0:02:52.598 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:46:46 +0000 (0:00:00.087) 0:02:52.685 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [ "sda", "sdb" ], "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:46:46 +0000 (0:00:00.039) 0:02:52.725 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:46:46 +0000 (0:00:00.080) 0:02:52.806 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.403) 0:02:53.209 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002883", "end": "2022-06-01 12:46:46.819077", "rc": 0, "start": "2022-06-01 12:46:46.816194" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.384) 0:02:53.594 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003435", "end": "2022-06-01 12:46:47.198043", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:46:47.194608" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.379) 0:02:53.973 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.070) 0:02:54.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.039) 0:02:54.083 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:46:47 +0000 (0:00:00.063) 0:02:54.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.042) 0:02:54.189 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.028) 0:02:54.218 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.028) 0:02:54.246 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.037) 0:02:54.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.036) 0:02:54.320 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.037) 0:02:54.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.033) 0:02:54.391 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.027) 0:02:54.419 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.055) 0:02:54.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.030) 
0:02:54.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.030) 0:02:54.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.033) 0:02:54.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.029) 0:02:54.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.031) 0:02:54.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.031) 0:02:54.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.030) 0:02:54.693 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.029) 0:02:54.723 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.060) 0:02:54.783 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.064) 0:02:54.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.033) 0:02:54.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.033) 0:02:54.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.032) 0:02:54.948 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.060) 0:02:55.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.035) 0:02:55.044 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.032) 0:02:55.076 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.030) 0:02:55.106 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:46:48 +0000 (0:00:00.032) 0:02:55.139 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.060) 0:02:55.200 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.066) 0:02:55.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.031) 0:02:55.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.030) 0:02:55.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.033) 0:02:55.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.036) 0:02:55.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.085) 0:02:55.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.031) 0:02:55.516 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.030) 0:02:55.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.030) 0:02:55.577 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.030) 0:02:55.608 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.065) 0:02:55.674 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.037) 0:02:55.711 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.124) 0:02:55.836 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.035) 0:02:55.871 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.039) 0:02:55.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.030) 0:02:55.941 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.036) 0:02:55.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.029) 0:02:56.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.031) 0:02:56.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.032) 
0:02:56.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.034) 0:02:56.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:46:49 +0000 (0:00:00.033) 0:02:56.139 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.050) 0:02:56.189 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.028) 0:02:56.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.035) 0:02:56.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.028) 0:02:56.281 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.031) 0:02:56.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.030) 0:02:56.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.028) 0:02:56.371 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.373) 0:02:56.745 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.037) 0:02:56.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.025) 0:02:56.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.034) 0:02:56.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.030) 0:02:56.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.024) 0:02:56.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.030) 0:02:56.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.032) 0:02:56.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.029) 0:02:56.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.026) 0:02:57.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.029) 0:02:57.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.029) 0:02:57.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.029) 0:02:57.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.033) 0:02:57.141 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:46:50 +0000 (0:00:00.029) 0:02:57.171 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.043) 0:02:57.215 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.035) 0:02:57.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.030) 0:02:57.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.502 ******** skipping: [/cache/rhel-x.qcow2] => 
{
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.535 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.029) 0:02:57.565 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.031) 0:02:57.596 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.030) 0:02:57.627 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.030) 0:02:57.658 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.030) 0:02:57.689 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "2147483648"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.078) 0:02:57.767 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.800 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.832 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.865 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.897 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.032) 0:02:57.930 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.039) 0:02:57.969 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "2147483648"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.033) 0:02:58.003 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.029) 0:02:58.032 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.027) 0:02:58.060 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.027) 0:02:58.088 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.030) 0:02:58.119 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.034) 0:02:58.153 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:46:51 +0000 (0:00:00.029) 0:02:58.183 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.031) 0:02:58.214 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.030) 0:02:58.245 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.028) 0:02:58.274 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.028) 0:02:58.302 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.028) 0:02:58.331 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP
*********************************************************************
/cache/rhel-x.qcow2 : ok=1082 changed=14 unreachable=0 failed=0 skipped=932 rescued=0 ignored=0

Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.014) 0:02:58.345 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.47s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.65s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.00s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : set up new/current mounts ------------------ 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.12s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : remove obsolete mounts --------------------- 1.10s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
linux-system-roles.storage : set up new/current mounts ------------------ 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_scsi_generated.yml:3
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:46:52 +0000 (0:00:00.025) 0:00:00.025 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:46:54 +0000 (0:00:01.267) 0:00:01.293 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_create_raid_volume_then_remove.yml *****************************
1 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path:
/tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:2 Wednesday 01 June 2022 16:46:54 +0000 (0:00:00.012) 0:00:01.305 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:10 Wednesday 01 June 2022 16:46:55 +0000 (0:00:01.060) 0:00:02.366 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:46:55 +0000 (0:00:00.038) 0:00:02.404 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:46:55 +0000 (0:00:00.150) 0:00:02.555 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:46:55 +0000 (0:00:00.505) 0:00:03.060 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:46:56 +0000 (0:00:00.076) 0:00:03.137 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:46:56 +0000 (0:00:00.022) 0:00:03.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:46:56 +0000 (0:00:00.023) 0:00:03.183 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:46:56 +0000 (0:00:00.188) 0:00:03.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:46:56 +0000 (0:00:00.018) 0:00:03.390 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:46:57 +0000 (0:00:01.067) 0:00:04.458 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:46:57 +0000 (0:00:00.045) 0:00:04.503 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:46:57 +0000 (0:00:00.043) 0:00:04.547 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:46:58 +0000 (0:00:00.690) 0:00:05.238 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:46:58 +0000 (0:00:00.081) 0:00:05.320 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:46:58 +0000 (0:00:00.020) 0:00:05.340 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:46:58 +0000 (0:00:00.021) 0:00:05.361 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:46:58 +0000 (0:00:00.019) 0:00:05.381 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:46:59 +0000 (0:00:00.802) 0:00:06.184 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": 
"nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", 
"source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:47:00 +0000 (0:00:01.780) 0:00:07.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:47:00 +0000 (0:00:00.043) 0:00:08.008 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:47:00 +0000 (0:00:00.024) 0:00:08.033 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.512) 0:00:08.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.030) 0:00:08.576 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.025) 0:00:08.602 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.033) 0:00:08.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list 
of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.030) 0:00:08.666 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.032) 0:00:08.698 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.027) 0:00:08.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.028) 0:00:08.754 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.028) 0:00:08.782 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:47:01 +0000 (0:00:00.031) 0:00:08.814 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:47:02 +0000 (0:00:00.448) 0:00:09.263 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:47:02 +0000 (0:00:00.028) 0:00:09.291 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:13 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.835) 0:00:10.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:20 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.030) 0:00:10.157 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.044) 0:00:10.201 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.510) 0:00:10.712 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.035) 0:00:10.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.029) 0:00:10.777 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:24 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.032) 0:00:10.810 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.053) 0:00:10.863 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:47:03 +0000 (0:00:00.040) 0:00:10.904 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.513) 0:00:11.418 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.069) 0:00:11.487 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.029) 0:00:11.516 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.028) 0:00:11.545 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.056) 0:00:11.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:47:04 +0000 
(0:00:00.054) 0:00:11.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.029) 0:00:11.685 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.031) 0:00:11.716 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.033) 0:00:11.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.028) 0:00:11.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.027) 0:00:11.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get 
service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.030) 0:00:11.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.028) 0:00:11.865 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.042) 0:00:11.907 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:47:04 +0000 (0:00:00.026) 0:00:11.934 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], 
"mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" } ], "packages": [ "mdadm", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:47:12 +0000 (0:00:07.697) 0:00:19.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:47:12 +0000 (0:00:00.031) 0:00:19.663 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:47:12 
+0000 (0:00:00.027) 0:00:19.690 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" } ], "packages": [ "mdadm", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, 
"raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:47:12 +0000 (0:00:00.041) 0:00:19.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:47:12 +0000 (0:00:00.037) 0:00:19.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:47:12 +0000 (0:00:00.037) 0:00:19.806 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:47:12 +0000 (0:00:00.029) 0:00:19.835 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:47:13 +0000 (0:00:00.972) 0:00:20.808 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f3e4ae78-ae70-4963-8165-a1f573256037', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:47:14 +0000 (0:00:00.543) 0:00:21.352 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:47:14 +0000 (0:00:00.652) 0:00:22.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:47:15 +0000 (0:00:00.365) 0:00:22.370 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:47:15 +0000 (0:00:00.029) 0:00:22.399 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:36 Wednesday 01 June 2022 16:47:16 +0000 (0:00:00.831) 0:00:23.230 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:47:16 +0000 (0:00:00.051) 0:00:23.281 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:47:16 +0000 (0:00:00.030) 0:00:23.312 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:47:16 +0000 (0:00:00.039) 0:00:23.351 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "20G", "type": "raid0", "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "3ab8ca3c-df94-0582-f8aa-96d2f97aa959" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "3ab8ca3c-df94-0582-f8aa-96d2f97aa959" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": 
"/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:47:16 +0000 (0:00:00.542) 0:00:23.894 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002293", "end": "2022-06-01 12:47:16.673193", "rc": 0, "start": "2022-06-01 12:47:16.670900" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=f3e4ae78-ae70-4963-8165-a1f573256037 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.452) 0:00:24.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002423", "end": "2022-06-01 12:47:17.025483", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:47:17.023060" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.352) 0:00:24.698 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 
Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.028) 0:00:24.726 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.031) 0:00:24.757 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.059) 0:00:24.817 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.034) 0:00:24.851 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.121) 0:00:24.973 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.035) 0:00:25.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.040) 0:00:25.049 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:47:17 +0000 (0:00:00.036) 0:00:25.085 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.035) 0:00:25.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.035) 0:00:25.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.034) 0:00:25.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.033) 0:00:25.224 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.030) 0:00:25.255 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.031) 0:00:25.286 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f3e4ae78-ae70-4963-8165-a1f573256037 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.055) 0:00:25.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.036) 0:00:25.378 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.041) 0:00:25.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.030) 0:00:25.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, 
"storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.030) 0:00:25.480 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.038) 0:00:25.519 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.036) 0:00:25.555 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102031.8881216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102031.8881216, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8574, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102031.8881216, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.378) 0:00:25.934 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.038) 0:00:25.972 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.036) 0:00:26.009 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.034) 0:00:26.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:47:18 +0000 (0:00:00.035) 0:00:26.079 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.036) 0:00:26.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.204 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.036) 0:00:26.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.031) 0:00:26.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.030) 0:00:26.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.031) 0:00:26.393 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.036) 0:00:26.430 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.032) 0:00:26.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.030) 0:00:26.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.071) 0:00:26.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.030) 0:00:26.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.029) 0:00:26.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.006786", "end": "2022-06-01 12:47:19.320576", "rc": 0, "start": "2022-06-01 12:47:19.313790" } STDOUT:
/dev/md/test1:
        Version : 1.2
  Creation Time : Wed Jun 1 12:47:06 2022
     Raid Level : raid0
     Array Size : 20951040 (19.98 GiB 21.45 GB)
   Raid Devices : 2
  Total Devices : 2
    Persistence : Superblock is persistent
    Update Time : Wed Jun 1 12:47:06 2022
          State : clean
 Active Devices : 2
Working Devices : 2
 Failed Devices : 0
  Spare Devices : 0
         Layout : -unknown-
     Chunk Size : 512K
Consistency Policy : none
           Name : test1
           UUID : 3ab8ca3c:df940582:f8aa96d2:f97aa959
         Events : 0
    Number   Major   Minor   RaidDevice   State
       0       8        1        0        active sync   /dev/sda1
       1       8       17        1        active sync   /dev/sdb1
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.373) 0:00:26.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.039) 0:00:27.039 ******** ok:
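[Editor's note: the `mdadm --detail` output captured above feeds the regex facts that follow (e.g. `storage_test_md_active_devices_re`). A minimal Python sketch of extracting the same fields, using a hypothetical `mdadm_field` helper over a trimmed copy of that output:]

```python
# Hypothetical sketch: pull "Name : value" fields out of mdadm --detail
# output like the STDOUT shown in the log above.
import re

MDADM_DETAIL = """\
/dev/md/test1:
        Version : 1.2
     Raid Level : raid0
     Array Size : 20951040 (19.98 GiB 21.45 GB)
   Raid Devices : 2
  Total Devices : 2
 Active Devices : 2
  Spare Devices : 0
"""

def mdadm_field(text, name):
    """Return the value after 'name :', or None if the field is absent."""
    m = re.search(r"^\s*%s\s*:\s*(.+)$" % re.escape(name), text, re.MULTILINE)
    return m.group(1).strip() if m else None

assert mdadm_field(MDADM_DETAIL, "Raid Level") == "raid0"
assert mdadm_field(MDADM_DETAIL, "Active Devices") == "2"
```

[Because `raid_device_count`, `raid_spare_count`, and `raid_metadata_version` are all null in this test, the role's regexes render as "... : None" and the corresponding count checks below are skipped.]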
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:47:19 +0000 (0:00:00.037) 0:00:27.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ None\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.040) 0:00:27.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.033) 0:00:27.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.032) 0:00:27.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.032) 0:00:27.215 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 21474836480, "changed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.456) 
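[Editor's note: the "parse the actual size of the volume" task above resolves "20 GiB" to 21474836480 bytes. A small sketch of the binary-unit arithmetic involved, with a hypothetical `to_bytes` helper handling binary (IEC) units only:]

```python
# Hypothetical sketch: convert a human-readable IEC size such as the
# "20 GiB" reported by the size-parsing task above into a byte count.
UNITS = {"B": 1, "KiB": 1024, "MiB": 1024**2, "GiB": 1024**3, "TiB": 1024**4}

def to_bytes(size):
    """Parse e.g. '20 GiB' into bytes (binary units only, for this sketch)."""
    value, unit = size.split()
    return int(float(value) * UNITS[unit])

# Two 10 GiB members in RAID0 give a ~20 GiB array, matching the log.
assert to_bytes("20 GiB") == 21474836480
```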
0:00:27.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.029) 0:00:27.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.040) 0:00:27.742 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.036) 0:00:27.778 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.030) 0:00:27.808 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.029) 0:00:27.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.028) 0:00:27.866 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.036) 0:00:27.902 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.036) 0:00:27.939 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 21474836480, "changed": false, "failed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.036) 0:00:27.975 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.030) 0:00:28.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.027) 0:00:28.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:47:20 +0000 (0:00:00.027) 0:00:28.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.027) 0:00:28.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.030) 0:00:28.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.030) 0:00:28.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.027) 0:00:28.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.027) 0:00:28.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.027) 0:00:28.232 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.028) 0:00:28.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:38 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.027) 0:00:28.289 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.056) 0:00:28.346 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.042) 0:00:28.388 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.511) 0:00:28.899 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.082) 0:00:28.981 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.030) 0:00:29.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:47:21 +0000 (0:00:00.031) 0:00:29.043 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.060) 0:00:29.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:47:22 +0000 
(0:00:00.025) 0:00:29.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.030) 0:00:29.159 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.033) 0:00:29.192 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.036) 0:00:29.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.031) 0:00:29.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.069) 0:00:29.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get 
service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.029) 0:00:29.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.029) 0:00:29.389 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.044) 0:00:29.434 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:47:22 +0000 (0:00:00.026) 0:00:29.461 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:47:23 +0000 (0:00:01.336) 0:00:30.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.031) 0:00:30.829 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.027) 0:00:30.857 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": 
"/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.036) 0:00:30.894 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.032) 0:00:30.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.036) 0:00:30.963 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:47:23 +0000 (0:00:00.040) 0:00:31.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:47:24 +0000 (0:00:00.643) 0:00:31.647 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f3e4ae78-ae70-4963-8165-a1f573256037', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037" } TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:47:24 +0000 (0:00:00.377) 0:00:32.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:47:25 +0000 (0:00:00.635) 0:00:32.660 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:47:25 +0000 (0:00:00.375) 0:00:33.035 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:47:25 +0000 (0:00:00.029) 0:00:33.065 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:50 Wednesday 01 June 2022 16:47:26 +0000 (0:00:00.843) 0:00:33.909 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:47:26 +0000 (0:00:00.056) 0:00:33.965 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:47:26 +0000 (0:00:00.032) 0:00:33.997 ******** ok: [/cache/rhel-x.qcow2] => { 
"_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:47:26 +0000 (0:00:00.080) 0:00:34.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "20G", "type": "raid0", "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "3ab8ca3c-df94-0582-f8aa-96d2f97aa959" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "3ab8ca3c-df94-0582-f8aa-96d2f97aa959" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", 
"name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:47:27 +0000 (0:00:00.373) 0:00:34.451 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002826", "end": "2022-06-01 12:47:27.145007", "rc": 0, "start": "2022-06-01 12:47:27.142181" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=f3e4ae78-ae70-4963-8165-a1f573256037 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:47:27 +0000 (0:00:00.372) 0:00:34.824 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002607", "end": "2022-06-01 12:47:27.509941", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:47:27.507334" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.362) 0:00:35.186 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.030) 0:00:35.217 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.030) 0:00:35.247 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.060) 0:00:35.308 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.033) 0:00:35.342 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.106) 0:00:35.448 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.033) 0:00:35.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "f3e4ae78-ae70-4963-8165-a1f573256037" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.038) 0:00:35.520 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.042) 0:00:35.563 ******** ok: [/cache/rhel-x.qcow2] => 
{ "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.036) 0:00:35.600 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.039) 0:00:35.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.032) 0:00:35.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.033) 0:00:35.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.032) 0:00:35.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 
01 June 2022 16:47:28 +0000 (0:00:00.034) 0:00:35.772 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f3e4ae78-ae70-4963-8165-a1f573256037 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.048) 0:00:35.821 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.034) 0:00:35.855 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.035) 0:00:35.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.031) 0:00:35.922 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": 
null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.033) 0:00:35.956 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:47:28 +0000 (0:00:00.087) 0:00:36.044 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.074) 0:00:36.118 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102031.8881216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102031.8881216, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8574, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102031.8881216, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.525) 0:00:36.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.038) 0:00:36.683 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.035) 0:00:36.719 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.032) 0:00:36.752 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.035) 0:00:36.788 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.035) 0:00:36.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.028) 0:00:36.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 
Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.031) 0:00:36.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.028) 0:00:36.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.036) 0:00:36.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.030) 0:00:36.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.033) 0:00:37.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:47:29 +0000 (0:00:00.031) 0:00:37.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.076) 
0:00:37.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.031) 0:00:37.150 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.038) 0:00:37.188 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.034) 0:00:37.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.029) 0:00:37.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:47:30 +0000 (0:00:00.029) 0:00:37.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 
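The crypttab bookkeeping in the preceding tasks amounts to comparing the collected `/etc/crypttab` entries against the expected count (here `0`, since the volume is unencrypted and every LUKS check was skipped). A minimal sketch of such an assertion, assuming only the fact names shown in the log above (the real task file `test-verify-volume-encryption.yml` is not reproduced here):

```yaml
# Sketch only: fact names come from the set_fact output above;
# the assert wording is assumed, not copied from the test file.
- name: Check for /etc/crypttab entry
  ansible.builtin.assert:
    that:
      - _storage_test_crypttab_entries | length ==
        _storage_test_expected_crypttab_entries | int
```

With `_storage_test_crypttab_entries: []` and an expected count of `"0"`, the assertion passes, which is consistent with the "All assertions passed" result logged for this task.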
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.030)       0:00:37.313 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.029)       0:00:37.343 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "mdadm",
        "--detail",
        "/dev/md/test1"
    ],
    "delta": "0:00:00.006313",
    "end": "2022-06-01 12:47:30.038113",
    "rc": 0,
    "start": "2022-06-01 12:47:30.031800"
}

STDOUT:

/dev/md/test1:
           Version : 1.2
     Creation Time : Wed Jun  1 12:47:06 2022
        Raid Level : raid0
        Array Size : 20951040 (19.98 GiB 21.45 GB)
      Raid Devices : 2
     Total Devices : 2
       Persistence : Superblock is persistent

       Update Time : Wed Jun  1 12:47:06 2022
             State : clean
    Active Devices : 2
   Working Devices : 2
    Failed Devices : 0
     Spare Devices : 0

            Layout : -unknown-
        Chunk Size : 512K

Consistency Policy : none

              Name : test1
              UUID : 3ab8ca3c:df940582:f8aa96d2:f97aa959
            Events : 0

    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.373)       0:00:37.716 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.038)       0:00:37.756 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 0\\\n"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.038)       0:00:37.794 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.2\\\n"
    },
    "changed": false
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.036)       0:00:37.831 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.039)       0:00:37.870 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.040)       0:00:37.911 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  16:47:30 +0000 (0:00:00.038)       0:00:37.949 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 21474836480,
    "changed": false,
    "lvm": "20g",
    "parted": "20GiB",
    "size": "20 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.370)       0:00:38.320 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.032)       0:00:38.352 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.031)       0:00:38.383 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.032)       0:00:38.415 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.029)       0:00:38.445 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.029)       0:00:38.474 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.031)       0:00:38.506 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  16:47:31 +0000 (0:00:00.031)       0:00:38.538 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.030) 0:00:38.568 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 21474836480, "changed": false, "failed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.034) 0:00:38.602 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.033) 0:00:38.636 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.028) 0:00:38.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.031) 0:00:38.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.030) 0:00:38.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.030) 0:00:38.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.031) 0:00:38.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.031) 0:00:38.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.031) 0:00:38.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.031) 0:00:38.882 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.030) 0:00:38.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": 
null, "storage_test_volume": null }, "changed": false } TASK [Remove the disk device created above] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:52 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.030) 0:00:38.944 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.072) 0:00:39.016 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:47:31 +0000 (0:00:00.044) 0:00:39.061 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.498) 0:00:39.559 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.082) 0:00:39.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.034) 0:00:39.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.031) 0:00:39.707 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.060) 0:00:39.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.026) 0:00:39.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.032) 0:00:39.827 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.071) 0:00:39.899 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "state": "absent", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.037) 0:00:39.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.036) 0:00:39.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.031) 0:00:40.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.031) 0:00:40.036 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:47:32 +0000 (0:00:00.030) 0:00:40.066 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:47:33 +0000 (0:00:00.045) 0:00:40.112 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:47:33 +0000 (0:00:00.027) 0:00:40.139 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": 
[ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:47:35 +0000 (0:00:01.986) 0:00:42.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.031) 0:00:42.157 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.027) 0:00:42.184 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": 
"/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.039) 0:00:42.223 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.033) 0:00:42.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.036) 0:00:42.294 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f3e4ae78-ae70-4963-8165-a1f573256037', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f3e4ae78-ae70-4963-8165-a1f573256037" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:47:35 +0000 (0:00:00.400) 0:00:42.694 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:47:36 +0000 (0:00:00.677) 0:00:43.372 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:47:36 +0000 (0:00:00.031) 0:00:43.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:47:36 +0000 (0:00:00.678) 0:00:44.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:47:37 +0000 (0:00:00.389) 0:00:44.471 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:47:37 +0000 (0:00:00.029) 0:00:44.501 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:65 Wednesday 01 June 2022 16:47:38 +0000 (0:00:00.836) 0:00:45.337 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print 
out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:47:38 +0000 (0:00:00.059) 0:00:45.397 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:47:38 +0000 (0:00:00.029) 0:00:45.427 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:47:38 +0000 (0:00:00.037) 0:00:45.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:47:38 +0000 (0:00:00.365) 0:00:45.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002633", 
"end": "2022-06-01 12:47:38.520834", "rc": 0, "start": "2022-06-01 12:47:38.518201" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  16:47:39 +0000 (0:00:00.370)       0:00:46.200 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002449",
    "end": "2022-06-01 12:47:38.882084",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-06-01 12:47:38.879635"
}

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  16:47:39 +0000 (0:00:00.402)       0:00:46.603 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022  16:47:39 +0000 (0:00:00.029)       0:00:46.632 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  16:47:39 +0000 (0:00:00.030)       0:00:46.663 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.059) 0:00:46.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.033) 0:00:46.756 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.106) 0:00:46.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.034) 0:00:46.898 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.042) 0:00:46.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.028) 0:00:46.969 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.034) 0:00:47.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.034) 0:00:47.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:47:39 +0000 (0:00:00.035) 0:00:47.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.030) 0:00:47.104 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.032) 0:00:47.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.031) 0:00:47.168 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.045) 0:00:47.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.024) 0:00:47.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 16:47:40 +0000 (0:00:00.035) 0:00:47.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.029) 0:00:47.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.031) 0:00:47.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.028) 0:00:47.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.025) 0:00:47.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.360) 0:00:47.749 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.035) 0:00:47.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.024) 0:00:47.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.031) 0:00:47.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.035) 0:00:47.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.025) 0:00:47.902 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.029) 0:00:47.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] 
********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.028) 0:00:47.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.028) 0:00:47.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.024) 0:00:48.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.032) 0:00:48.045 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:47:40 +0000 (0:00:00.029) 0:00:48.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.030) 0:00:48.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.031) 0:00:48.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.167 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.046) 0:00:48.213 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.036) 0:00:48.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.034) 0:00:48.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.372 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.031) 0:00:48.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.028) 0:00:48.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.030) 0:00:48.610 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.030) 0:00:48.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.030) 0:00:48.671 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.032) 0:00:48.703 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.030) 0:00:48.734 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.763 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.032) 0:00:48.796 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.068) 0:00:48.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.031) 0:00:48.895 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.032) 0:00:48.928 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.031) 0:00:48.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:48.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:49.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.031) 0:00:49.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:47:41 +0000 (0:00:00.029) 0:00:49.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.028) 0:00:49.108 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.028) 0:00:49.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.028) 0:00:49.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.027) 0:00:49.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.030) 0:00:49.223 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=215 changed=4 unreachable=0 failed=0 skipped=161 rescued=0 ignored=0 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.015) 0:00:49.238 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.06s /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:2 ------------- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.97s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 
linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : set up new/current mounts ------------------ 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. 
Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:47:42 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:47:44 +0000 
(0:00:01.234) 0:00:01.257 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_raid_volume_then_remove_nvme_generated.yml ************** 2 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:47:44 +0000 (0:00:00.015) 0:00:01.273 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. 
Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:47:44 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:47:46 +0000 
(0:00:01.252) 0:00:01.275 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_create_raid_volume_then_remove_scsi_generated.yml ************** 2 plays in /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_scsi_generated.yml:3 Wednesday 01 June 2022 16:47:46 +0000 (0:00:00.014) 0:00:01.289 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_scsi_generated.yml:7 Wednesday 01 June 2022 16:47:47 +0000 (0:00:01.045) 0:00:02.334 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:2 Wednesday 01 June 2022 16:47:47 +0000 (0:00:00.022) 0:00:02.357 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:10 Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.826) 0:00:03.184 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:47:48 
+0000 (0:00:00.038) 0:00:03.222 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.154) 0:00:03.377 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.539) 0:00:03.916 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.074) 0:00:03.991 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK
[linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.023) 0:00:04.014 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:47:48 +0000 (0:00:00.021) 0:00:04.036 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:47:49 +0000 (0:00:00.195) 0:00:04.232 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:47:49 +0000 (0:00:00.019) 0:00:04.251 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:47:50 +0000 (0:00:01.085) 0:00:05.337 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:47:50 +0000 (0:00:00.044) 0:00:05.382 ********
ok:
[/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:47:50 +0000 (0:00:00.043) 0:00:05.425 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:47:50 +0000 (0:00:00.671) 0:00:06.097 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:47:51 +0000 (0:00:00.078) 0:00:06.175 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:47:51 +0000 (0:00:00.020) 0:00:06.196 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:47:51 +0000 (0:00:00.023) 0:00:06.220 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:47:51 +0000 (0:00:00.020) 0:00:06.241 ********
ok: [/cache/rhel-x.qcow2] =>
{ "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:47:51 +0000 (0:00:00.803) 0:00:07.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:47:53 +0000 (0:00:01.795) 0:00:08.841 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:47:53 +0000 (0:00:00.043) 0:00:08.884 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:47:53 +0000 (0:00:00.027) 0:00:08.912 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some
platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.494) 0:00:09.406 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.033) 0:00:09.439 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.027) 0:00:09.466 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.032) 0:00:09.499 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.031) 0:00:09.530 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.031) 0:00:09.562 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.029) 0:00:09.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.028) 0:00:09.621 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.027) 0:00:09.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:47:54 +0000 (0:00:00.028) 0:00:09.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:47:55 +0000 (0:00:00.454) 0:00:10.130 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:47:55 +0000 (0:00:00.026) 0:00:10.157 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:13 Wednesday 01 June 2022 16:47:55 +0000 (0:00:00.828) 0:00:10.985 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:20 Wednesday 01 June 2022 16:47:55 +0000 (0:00:00.030) 0:00:11.015 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:47:55 +0000 (0:00:00.043) 0:00:11.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.526) 0:00:11.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.036) 0:00:11.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.028) 0:00:11.651 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:24 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.033) 0:00:11.684 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.054) 0:00:11.739 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:47:56 +0000 (0:00:00.048) 0:00:11.787 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.497) 0:00:12.284 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.065) 0:00:12.350 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.029) 0:00:12.379 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.028) 0:00:12.408 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.089) 0:00:12.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.025) 0:00:12.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.028) 0:00:12.551 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.030) 0:00:12.582 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.035) 0:00:12.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.029) 0:00:12.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.027) 0:00:12.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.028) 0:00:12.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.028) 0:00:12.731 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.042) 0:00:12.774 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:47:57 +0000 (0:00:00.028) 0:00:12.802 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:48:05 +0000 (0:00:07.457) 0:00:20.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.031) 0:00:20.291 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.029) 0:00:20.320 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", 
"device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.040) 0:00:20.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.047) 0:00:20.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.048) 0:00:20.457 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:48:05 +0000 (0:00:00.042) 0:00:20.499 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:48:06 +0000 (0:00:00.975) 
0:00:21.474 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:48:06 +0000 (0:00:00.566) 0:00:22.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:48:07 +0000 (0:00:00.626) 0:00:22.668 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:48:07 +0000 (0:00:00.370) 0:00:23.038 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:48:07 +0000 (0:00:00.026) 0:00:23.065 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:36 Wednesday 01 June 2022 
16:48:08 +0000 (0:00:00.837) 0:00:23.903 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:48:08 +0000 (0:00:00.052) 0:00:23.956 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:48:08 +0000 (0:00:00.029) 0:00:23.986 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "raid0", "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:48:08 +0000 (0:00:00.037) 0:00:24.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "20G", "type": "raid0", "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "1a252f17-88e1-e9a0-cb48-d2f44aec07c6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "1a252f17-88e1-e9a0-cb48-d2f44aec07c6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", 
"name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:48:09 +0000 (0:00:00.502) 0:00:24.526 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002404", "end": "2022-06-01 12:48:09.264929", "rc": 0, "start": "2022-06-01 12:48:09.262525" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:48:09 +0000 (0:00:00.452) 0:00:24.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002949", "end": "2022-06-01 12:48:09.654302", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:48:09.651353" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.389) 0:00:25.367 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.028) 0:00:25.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.029) 0:00:25.426 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.061) 0:00:25.487 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.033) 0:00:25.521 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.116) 0:00:25.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.033) 0:00:25.671 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.041) 0:00:25.713 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.039) 0:00:25.752 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 
Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.034) 0:00:25.786 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.034) 0:00:25.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.027) 0:00:25.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.027) 0:00:25.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.025) 0:00:25.901 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.029) 0:00:25.930 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", 
"storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.041) 0:00:25.972 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.030) 0:00:26.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.032) 0:00:26.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.028) 0:00:26.064 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:48:10 +0000 (0:00:00.032) 0:00:26.096 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.039) 0:00:26.136 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.037) 0:00:26.174 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102084.4731214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102084.4731214, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8925, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102084.4731214, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.372) 0:00:26.546 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:48:11 
+0000 (0:00:00.036) 0:00:26.582 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.035) 0:00:26.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.034) 0:00:26.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.034) 0:00:26.686 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.037) 0:00:26.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.030) 0:00:26.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.029) 0:00:26.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.030) 0:00:26.814 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.038) 0:00:26.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.030) 0:00:26.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.032) 0:00:26.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.062) 0:00:26.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.030) 0:00:27.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.029) 0:00:27.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.038) 0:00:27.075 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:48:11 +0000 (0:00:00.032) 0:00:27.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.028) 0:00:27.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.031) 0:00:27.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.028) 0:00:27.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.029) 0:00:27.226 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.006935", "end": "2022-06-01 12:48:11.883120", "rc": 0, "start": "2022-06-01 12:48:11.876185" }

STDOUT:

/dev/md/test1:
           Version : 1.2
     Creation Time : Wed Jun  1 12:47:59 2022
        Raid Level : raid0
        Array Size : 20951040 (19.98 GiB 21.45 GB)
      Raid Devices : 2
     Total Devices : 2
       Persistence : Superblock is persistent

       Update Time : Wed Jun  1 12:47:59 2022
             State : clean
    Active Devices : 2
   Working Devices : 2
    Failed Devices : 0
     Spare Devices : 0

            Layout : -unknown-
        Chunk Size : 512K

Consistency Policy : none

              Name : test1
              UUID : 1a252f17:88e1e9a0:cb48d2f4:4aec07c6
            Events : 0

    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1

TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.366) 0:00:27.593 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.039) 0:00:27.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ None\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.045) 0:00:27.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ None\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.041) 0:00:27.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.035) 0:00:27.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.034) 0:00:27.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:48:12 +0000 (0:00:00.034) 0:00:27.824 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 21474836480, "changed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.479) 0:00:28.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.032) 0:00:28.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.031) 0:00:28.368 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.033) 0:00:28.401 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.432 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.493 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
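The size-verification tasks above and below compare the raw byte count reported for the volume (21474836480) against the human-readable forms in the parse result ("20 GiB", "20g", "20GiB"). As a reference for reading these values, here is a minimal sketch of that conversion; this is an illustration only, not the role's actual implementation (which relies on blivet's size handling):

```python
def bytes_to_forms(nbytes):
    """Convert a byte count into the three size forms seen in the log:
    a spaced form ("20 GiB"), an lvm-style form ("20g"), and a
    parted-style form ("20GiB"). Assumes binary (1024-based) units."""
    units = ["B", "KiB", "MiB", "GiB", "TiB"]
    value, idx = float(nbytes), 0
    while value >= 1024 and idx < len(units) - 1:
        value /= 1024
        idx += 1
    # Show whole numbers without a decimal point, otherwise two decimals
    num = int(value) if value == int(value) else round(value, 2)
    unit = units[idx]
    return {
        "size": f"{num} {unit}",           # e.g. "20 GiB"
        "lvm": f"{num}{unit[0].lower()}",  # e.g. "20g"
        "parted": f"{num}{unit}",          # e.g. "20GiB"
    }

print(bytes_to_forms(21474836480))
# → {'size': '20 GiB', 'lvm': '20g', 'parted': '20GiB'}
```

Note that the same conversion applied to the volume's requested size (21453864960 bytes) yields "19.98 GiB", which matches the array size mdadm reports for the raid0 device after superblock overhead.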
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.554 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 21474836480, "changed": false, "failed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.036) 0:00:28.591 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.034) 0:00:28.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.029) 0:00:28.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.029) 0:00:28.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.029) 0:00:28.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.030) 0:00:28.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.027) 0:00:28.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.027) 0:00:28.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.027) 0:00:28.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.028) 0:00:28.888 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": 
null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:38 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.028) 0:00:28.917 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.057) 0:00:28.975 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:48:13 +0000 (0:00:00.042) 0:00:29.017 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.502) 0:00:29.519 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.067) 0:00:29.587 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.028) 0:00:29.616 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.028) 0:00:29.644 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.093) 0:00:29.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.038) 0:00:29.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.035) 0:00:29.813 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.036) 0:00:29.849 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.039) 0:00:29.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.030) 0:00:29.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.034) 0:00:29.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.031) 0:00:29.985 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.031) 0:00:30.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.045) 0:00:30.061 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:48:14 +0000 (0:00:00.028) 0:00:30.090 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:48:16 +0000 (0:00:01.365) 0:00:31.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.031) 0:00:31.488 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.029) 0:00:31.517 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.037) 0:00:31.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.036) 0:00:31.591 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": 
"uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.037) 0:00:31.629 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:48:16 +0000 (0:00:00.030) 0:00:31.659 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:48:17 +0000 (0:00:00.667) 0:00:32.326 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:48:17 +0000 (0:00:00.391) 
0:00:32.717 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:48:18 +0000 (0:00:00.663) 0:00:33.381 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:48:18 +0000 (0:00:00.355) 0:00:33.736 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:48:18 +0000 (0:00:00.030) 0:00:33.767 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:50 Wednesday 01 June 2022 16:48:19 +0000 (0:00:00.808) 0:00:34.576 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:48:19 +0000 (0:00:00.056) 0:00:34.633 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:48:19 +0000 (0:00:00.066) 0:00:34.699 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:48:19 +0000 (0:00:00.039) 0:00:34.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "20G", "type": "raid0", "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "1a252f17-88e1-e9a0-cb48-d2f44aec07c6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "1a252f17-88e1-e9a0-cb48-d2f44aec07c6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, 
"/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.387) 0:00:35.126 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002789", "end": "2022-06-01 12:48:19.769453", "rc": 0, "start": "2022-06-01 12:48:19.766664" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.355) 0:00:35.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002734", 
"end": "2022-06-01 12:48:20.125603", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:48:20.122869" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.358) 0:00:35.840 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.030) 0:00:35.871 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.036) 0:00:35.907 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.063) 0:00:35.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:48:20 +0000 (0:00:00.039) 0:00:36.010 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.115) 0:00:36.126 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.035) 0:00:36.161 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 5189226, "block_size": 4096, "block_total": 5234176, "block_used": 44950, "device": "/dev/md127", "fstype": "xfs", "inode_available": 10473469, "inode_total": 10473472, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,sunit=1024,swidth=2048,noquota", "size_available": 21255069696, "size_total": 21439184896, "uuid": "e27161a9-707b-48a1-a52f-5aa27e8afedc" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.045) 0:00:36.207 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.039) 0:00:36.247 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.034) 0:00:36.281 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.037) 0:00:36.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.032) 0:00:36.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.029) 0:00:36.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.029) 0:00:36.410 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.030) 0:00:36.440 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.045) 0:00:36.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.033) 0:00:36.519 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.036) 0:00:36.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.029) 0:00:36.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.031) 0:00:36.616 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.037) 0:00:36.654 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.039) 0:00:36.693 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102084.4731214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102084.4731214, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 8925, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102084.4731214, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:48:21 +0000 (0:00:00.390) 0:00:37.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.036) 0:00:37.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.039) 0:00:37.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.033) 0:00:37.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.034) 0:00:37.228 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.032) 0:00:37.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.029) 0:00:37.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.031) 0:00:37.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 
2022 16:48:22 +0000 (0:00:00.033) 0:00:37.355 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.038) 0:00:37.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.030) 0:00:37.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.030) 0:00:37.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.030) 0:00:37.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.029) 0:00:37.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.069) 0:00:37.585 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.038) 0:00:37.623 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.038) 0:00:37.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.037) 0:00:37.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.035) 0:00:37.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.035) 0:00:37.771 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:48:22 +0000 (0:00:00.037) 0:00:37.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.007275", "end": "2022-06-01 12:48:22.481528", "rc": 0, "start": "2022-06-01 12:48:22.474253" } STDOUT: /dev/md/test1: Version : 1.2 Creation Time : Wed Jun 1 12:47:59 2022 Raid Level : raid0 Array Size : 20951040 (19.98 GiB 21.45 GB) Raid Devices : 2 Total Devices : 2 Persistence : Superblock is persistent Update Time : Wed Jun 1 12:47:59 2022 State : clean Active Devices : 2 Working Devices : 2 Failed Devices : 0 Spare Devices : 0 Layout : -unknown- Chunk Size : 512K Consistency Policy : none Name : test1 UUID : 1a252f17:88e1e9a0:cb48d2f4:4aec07c6 Events : 0 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.387) 0:00:38.196 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.042) 0:00:38.238 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 0\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.040) 0:00:38.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 
1\\.2\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.038) 0:00:38.318 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.041) 0:00:38.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.042) 0:00:38.402 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.046) 0:00:38.449 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 21474836480, "changed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.396) 0:00:38.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.030) 0:00:38.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.032) 0:00:38.908 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.033) 0:00:38.941 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.030) 0:00:38.972 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.030) 0:00:39.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.033) 0:00:39.036 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.032) 0:00:39.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:48:23 +0000 (0:00:00.032) 0:00:39.101 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 21474836480, "changed": false, 
"failed": false, "lvm": "20g", "parted": "20GiB", "size": "20 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.037) 0:00:39.139 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.032) 0:00:39.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.030) 0:00:39.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.033) 0:00:39.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.030) 0:00:39.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.030) 0:00:39.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.030) 0:00:39.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.030) 0:00:39.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.029) 0:00:39.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.034) 0:00:39.420 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.031) 0:00:39.452 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the disk device created above] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:52 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.028) 0:00:39.481 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.072) 0:00:39.553 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.043) 0:00:39.597 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:48:24 +0000 (0:00:00.522) 0:00:40.120 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.074) 0:00:40.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.032) 0:00:40.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.031) 0:00:40.258 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.063) 0:00:40.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.028) 0:00:40.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.078) 0:00:40.429 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] 
*********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.034) 0:00:40.463 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "raid0", "state": "absent", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.035) 0:00:40.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.029) 0:00:40.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.029) 0:00:40.558 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.029) 0:00:40.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:48:25 +0000 
(0:00:00.030) 0:00:40.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.043) 0:00:40.662 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:48:25 +0000 (0:00:00.028) 0:00:40.690 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:48:27 +0000 (0:00:02.026) 0:00:42.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:48:27 +0000 (0:00:00.032) 0:00:42.749 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:48:27 +0000 (0:00:00.030) 0:00:42.780 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": 
"/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:48:27 +0000 (0:00:00.039) 0:00:42.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:48:27 +0000 (0:00:00.033) 0:00:42.853 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": 
[], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:48:27 +0000 (0:00:00.041) 0:00:42.894 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e27161a9-707b-48a1-a52f-5aa27e8afedc" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:48:28 +0000 (0:00:00.392) 0:00:43.287 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:48:28 +0000 (0:00:00.629) 0:00:43.916 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:48:28 +0000 (0:00:00.030) 0:00:43.947 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:48:29 +0000 (0:00:00.635) 0:00:44.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:48:29 +0000 (0:00:00.371) 0:00:44.955 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:48:29 +0000 (0:00:00.032) 0:00:44.988 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:65 Wednesday 01 June 2022 16:48:30 +0000 (0:00:00.838) 0:00:45.826 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:48:30 +0000 (0:00:00.060) 0:00:45.887 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:48:30 +0000 (0:00:00.067) 0:00:45.955 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "512 KiB", "raid_device_count": 2, "raid_level": "raid0", "raid_metadata_version": "1.2", "raid_spare_count": 0, "size": 21453864960, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:48:30 +0000 (0:00:00.038) 0:00:45.994 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:48:31 +0000 (0:00:00.366) 0:00:46.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002699", 
"end": "2022-06-01 12:48:31.033791", "rc": 0, "start": "2022-06-01 12:48:31.031092" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:48:31 +0000 (0:00:00.386) 0:00:46.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002751", "end": "2022-06-01 12:48:31.417890", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:48:31.415139" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.384) 0:00:47.131 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.190 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.057) 0:00:47.247 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.034) 0:00:47.282 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.111) 0:00:47.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.037) 0:00:47.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.040) 0:00:47.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.034) 0:00:47.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.032) 0:00:47.627 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.031) 0:00:47.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.035) 0:00:47.694 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.049) 0:00:47.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.026) 0:00:47.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 16:48:32 +0000 (0:00:00.037) 0:00:47.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.033) 0:00:47.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.030) 0:00:47.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.029) 0:00:47.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:48:32 +0000 (0:00:00.025) 0:00:47.927 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.367) 0:00:48.295 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.036) 0:00:48.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.026) 0:00:48.358 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.034) 0:00:48.392 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid0" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.034) 0:00:48.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.026) 0:00:48.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] 
********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.027) 0:00:48.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.028) 0:00:48.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.028) 0:00:48.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.028) 0:00:48.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.713 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.038) 0:00:48.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.064) 0:00:48.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.028) 0:00:48.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.027) 0:00:48.902 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:48.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.031) 0:00:48.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.030) 0:00:49.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.030) 0:00:49.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.029) 0:00:49.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:48:33 +0000 (0:00:00.030) 0:00:49.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.030) 0:00:49.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.031) 0:00:49.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.028) 0:00:49.233 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.031) 0:00:49.264 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.294 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.030) 0:00:49.354 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.028) 0:00:49.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.412 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.032) 0:00:49.444 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.032) 0:00:49.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.030) 0:00:49.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.028) 0:00:49.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.028) 0:00:49.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.027) 0:00:49.621 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.029) 0:00:49.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.027) 0:00:49.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.030) 0:00:49.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.030) 0:00:49.739 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=217 changed=4 unreachable=0 failed=0 skipped=161 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:34 +0000 (0:00:00.015) 0:00:49.754 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.46s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.03s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_scsi_generated.yml:3
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml:2 -------------
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:48:35 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:36 +0000 
(0:00:01.283) 0:00:01.306 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_default.yml **************************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_default.yml PLAY [Ensure that the role runs with default parameters] *********************** META: ran handlers TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:48:36 +0000 (0:00:00.016) 0:00:01.322 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:48:36 +0000 (0:00:00.136) 0:00:01.459 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.768) 0:00:02.227 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.064) 0:00:02.292 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.017) 0:00:02.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.014) 0:00:02.324 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.187) 0:00:02.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:48:37 +0000 (0:00:00.018) 0:00:02.529 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:48:38 +0000 (0:00:00.995) 0:00:03.525 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:48:38 +0000 (0:00:00.038) 0:00:03.563 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:48:38 +0000 (0:00:00.037) 0:00:03.601 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:48:39 +0000 (0:00:00.675) 0:00:04.277 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:48:39 +0000 (0:00:00.068) 0:00:04.345 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 
Wednesday 01 June 2022 16:48:39 +0000 (0:00:00.014) 0:00:04.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:48:39 +0000 (0:00:00.016) 0:00:04.376 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:48:39 +0000 (0:00:00.013) 0:00:04.390 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:48:40 +0000 (0:00:00.784) 0:00:05.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": 
"blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": 
"rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": 
"sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": 
"systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:48:42 +0000 (0:00:01.767) 0:00:06.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK 
[linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.033) 0:00:06.976 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.020) 0:00:06.997 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.525) 0:00:07.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.026) 0:00:07.549 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.022) 0:00:07.571 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.025) 0:00:07.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:48:42 +0000 (0:00:00.024) 0:00:07.621 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.024) 0:00:07.645 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.021) 0:00:07.666 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.022) 0:00:07.689 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.021) 0:00:07.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.022) 0:00:07.733 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.464) 0:00:08.197 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:48:43 +0000 (0:00:00.022) 0:00:08.219 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=21 changed=0 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:44 +0000 (0:00:00.774) 0:00:08.994 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : make sure required packages are installed --- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the 
pools and volumes to match the specified state --- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.46s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.14s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : show storage_pools ------------------------- 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 linux-system-roles.storage : show storage_volumes ----------------------- 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 linux-system-roles.storage : Workaround for udev issue on some platforms --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 linux-system-roles.storage : show blivet_output ------------------------- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 linux-system-roles.storage : set the list of pools for test verification --- 0.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 linux-system-roles.storage : set the list of volumes for test verification --- 
0.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:48:45 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:46 +0000 (0:00:01.252) 0:00:01.275 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_default_nvme_generated.yml ************************************* 2 plays in /tmp/tmp7247_7fr/tests/tests_default_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [Ensure that the role runs with default parameters] *********************** META: ran handlers META: 
ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:46 +0000 (0:00:00.018) 0:00:01.293 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:48:47 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:48 +0000 (0:00:01.256) 0:00:01.280 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.26s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_default_scsi_generated.yml ************************************* 2 plays in /tmp/tmp7247_7fr/tests/tests_default_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_default_scsi_generated.yml:3 Wednesday 01 
June 2022 16:48:48 +0000 (0:00:00.017) 0:00:01.297 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_default_scsi_generated.yml:7 Wednesday 01 June 2022 16:48:49 +0000 (0:00:01.158) 0:00:02.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [Ensure that the role runs with default parameters] *********************** META: ran handlers TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:48:49 +0000 (0:00:00.025) 0:00:02.481 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:48:49 +0000 (0:00:00.152) 0:00:02.634 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:48:50 +0000 (0:00:00.502) 0:00:03.137 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", 
"vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:48:50 +0000 (0:00:00.072) 0:00:03.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:48:50 +0000 (0:00:00.022) 0:00:03.233 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:48:50 +0000 (0:00:00.023) 0:00:03.256 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:48:50 +0000 (0:00:00.192) 0:00:03.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 
2022 16:48:50 +0000 (0:00:00.019) 0:00:03.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:48:51 +0000 (0:00:01.008) 0:00:04.476 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:48:51 +0000 (0:00:00.045) 0:00:04.522 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:48:51 +0000 (0:00:00.048) 0:00:04.570 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:48:52 +0000 (0:00:00.692) 0:00:05.263 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:48:52 +0000 (0:00:00.081) 0:00:05.345 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:48:52 +0000 (0:00:00.020) 0:00:05.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:48:52 +0000 (0:00:00.022) 0:00:05.388 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:48:52 +0000 (0:00:00.019) 0:00:05.408 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:48:53 +0000 (0:00:00.817) 0:00:06.225 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": 
"unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:48:55 +0000 (0:00:01.765) 0:00:07.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.043) 0:00:08.033 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.027) 0:00:08.060 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.535) 0:00:08.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.029) 0:00:08.626 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.026) 0:00:08.653 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.032) 0:00:08.685 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.031) 0:00:08.717 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.031) 0:00:08.748 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.030) 0:00:08.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.029) 0:00:08.808 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.026) 0:00:08.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:48:55 +0000 (0:00:00.029) 0:00:08.864 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK 
[linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:48:56 +0000 (0:00:00.457) 0:00:09.322 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:48:56 +0000 (0:00:00.027) 0:00:09.350 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=23 changed=0 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0

Wednesday 01 June 2022 16:48:57 +0000 (0:00:00.817) 0:00:10.168 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/tmp7247_7fr/tests/tests_default_scsi_generated.yml:3 ---------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.01s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.50s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.46s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.15s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : show storage_volumes ----------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
linux-system-roles.storage : show storage_pools ------------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
linux-system-roles.storage : show blivet_output ------------------------- 0.03s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
linux-system-roles.storage : set the list of pools for test verification --- 0.03s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 linux-system-roles.storage : set the list of volumes for test verification --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:48:57 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:48:59 +0000 (0:00:01.270) 0:00:01.294 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_deps.yml ******************************************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_deps.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:2 Wednesday 01 June 2022 16:48:59 +0000 
(0:00:00.011) 0:00:01.306 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:6 Wednesday 01 June 2022 16:49:00 +0000 (0:00:01.051) 0:00:02.357 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:00 +0000 (0:00:00.037) 0:00:02.395 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:00 +0000 (0:00:00.147) 0:00:02.542 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.530) 0:00:03.072 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.073) 0:00:03.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.022) 0:00:03.168 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.022) 0:00:03.191 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.190) 0:00:03.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:01 +0000 (0:00:00.018) 0:00:03.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:02 +0000 (0:00:01.044) 0:00:04.444 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:02 +0000 (0:00:00.046) 0:00:04.491 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:02 +0000 (0:00:00.047) 0:00:04.538 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:03 +0000 (0:00:00.720) 0:00:05.259 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:49:03 +0000 (0:00:00.084) 0:00:05.343 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:49:03 +0000 (0:00:00.018) 0:00:05.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:49:03 +0000 (0:00:00.023) 0:00:05.385 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:49:03 +0000 (0:00:00.020) 0:00:05.405 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:49:04 +0000 (0:00:00.849) 0:00:06.255 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state":
"inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name":
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name":
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
"insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service",
"source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
"oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
"pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
"raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
"rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
"rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status":
"disabled" },
"rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
"serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled"
},
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state":
"unknown", "status": "static" },
"systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-flush.service": {
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
"systemd-repart.service": {
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:49:06 +0000 (0:00:01.791) 0:00:08.046 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:06 +0000 
(0:00:00.043) 0:00:08.090 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.026) 0:00:08.117 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.520) 0:00:08.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.069) 0:00:08.707 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.026) 0:00:08.734 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.030) 0:00:08.764 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.032) 0:00:08.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.031) 0:00:08.828 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.027) 0:00:08.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.030) 0:00:08.887 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.027) 0:00:08.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:49:06 +0000 (0:00:00.028) 0:00:08.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:49:07 +0000 (0:00:00.444) 0:00:09.387 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:49:07 +0000 (0:00:00.027) 0:00:09.415 ******** ok: [/cache/rhel-x.qcow2] TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:10 Wednesday 01 June 2022 16:49:08 +0000 (0:00:00.828) 0:00:10.244 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:08 +0000 (0:00:00.043) 0:00:10.287 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2", "xfsprogs" ], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:28 Wednesday 01 June 2022 16:49:09 +0000 (0:00:00.978) 0:00:11.265 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [test disk and ext4 package deps] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:35 Wednesday 01 June 2022 16:49:09 +0000 (0:00:00.036) 0:00:11.302 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:09 +0000 (0:00:00.041) 0:00:11.343 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task 
path: /tmp/tmp7247_7fr/tests/tests_deps.yml:47 Wednesday 01 June 2022 16:49:10 +0000 (0:00:00.964) 0:00:12.308 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [test disk and swap package deps] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:54 Wednesday 01 June 2022 16:49:10 +0000 (0:00:00.036) 0:00:12.345 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:10 +0000 (0:00:00.042) 0:00:12.387 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:66 Wednesday 01 June 2022 16:49:11 +0000 (0:00:00.957) 0:00:13.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=31 changed=0 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 Wednesday 01 June 2022 16:49:11 +0000 (0:00:00.022) 0:00:13.366 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.05s /tmp/tmp7247_7fr/tests/tests_deps.yml:2 --------------------------------------- linux-system-roles.storage : make sure 
blivet is available -------------- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 test lvm and xfs package deps ------------------------------------------- 0.98s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- test lvm and xfs package deps ------------------------------------------- 0.96s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- test lvm and xfs package deps ------------------------------------------- 0.96s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- linux-system-roles.storage : make sure required packages are installed --- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.52s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.44s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : Workaround for udev issue on some platforms --- 0.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 linux-system-roles.storage : show storage_volumes ----------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 linux-system-roles.storage : show storage_pools ------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. 
Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:49:12 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:49:13 +0000 (0:00:01.280) 0:00:01.303 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_deps_nvme_generated.yml **************************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_deps_nvme_generated.yml PLAY [all] 
********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:49:13 +0000 (0:00:00.014) 0:00:01.318 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. 
Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:49:14 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:49:15 +0000 (0:00:01.230) 0:00:01.254 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_deps_scsi_generated.yml **************************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_deps_scsi_generated.yml PLAY [all] 
********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps_scsi_generated.yml:3 Wednesday 01 June 2022 16:49:15 +0000 (0:00:00.013) 0:00:01.268 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps_scsi_generated.yml:7 Wednesday 01 June 2022 16:49:16 +0000 (0:00:01.052) 0:00:02.320 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:2 Wednesday 01 June 2022 16:49:16 +0000 (0:00:00.024) 0:00:02.344 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:6 Wednesday 01 June 2022 16:49:17 +0000 (0:00:00.797) 0:00:03.142 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:17 +0000 (0:00:00.038) 0:00:03.181 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:17 +0000 (0:00:00.152) 0:00:03.333 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:17 +0000 (0:00:00.515) 0:00:03.849 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:17 +0000 (0:00:00.077) 0:00:03.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:18 +0000 (0:00:00.022) 0:00:03.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 16:49:18 +0000 (0:00:00.022) 0:00:03.972 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:18 +0000 (0:00:00.191) 0:00:04.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:18 +0000 (0:00:00.019) 0:00:04.182 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:19 +0000 (0:00:00.969) 0:00:05.152 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:19 +0000 (0:00:00.046) 0:00:05.199 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:19 +0000 (0:00:00.044) 0:00:05.244 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
[linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:19 +0000 (0:00:00.647) 0:00:05.891 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:49:20 +0000 (0:00:00.080) 0:00:05.971 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:49:20 +0000 (0:00:00.020) 0:00:05.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:49:20 +0000 (0:00:00.022) 0:00:06.014 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:49:20 +0000 (0:00:00.019) 0:00:06.034 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:49:20 +0000 (0:00:00.790) 0:00:06.824 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:49:22 +0000 (0:00:01.817) 0:00:08.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:22 +0000 (0:00:00.071) 0:00:08.713 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:22 +0000 (0:00:00.027) 0:00:08.741 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.536) 0:00:09.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.037) 0:00:09.315 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.027) 0:00:09.343 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.034) 0:00:09.377 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.030) 0:00:09.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.030) 0:00:09.439 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.027) 0:00:09.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.028) 0:00:09.495 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.026) 0:00:09.521 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:49:23 +0000 (0:00:00.028) 0:00:09.550 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:49:24 +0000 (0:00:00.454) 0:00:10.005 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:49:24 +0000 (0:00:00.028) 0:00:10.033 ******** ok: [/cache/rhel-x.qcow2] TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:10 Wednesday 01 June 2022 16:49:24 +0000 (0:00:00.829) 0:00:10.863 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:24 +0000 (0:00:00.043) 0:00:10.906 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2", "xfsprogs" ], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task path: 
/tmp/tmp7247_7fr/tests/tests_deps.yml:28 Wednesday 01 June 2022 16:49:25 +0000 (0:00:01.022) 0:00:11.929 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [test disk and ext4 package deps] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:35 Wednesday 01 June 2022 16:49:26 +0000 (0:00:00.036) 0:00:11.965 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:26 +0000 (0:00:00.042) 0:00:12.008 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:47 Wednesday 01 June 2022 16:49:27 +0000 (0:00:01.037) 0:00:13.045 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [test disk and swap package deps] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:54 Wednesday 01 June 2022 16:49:27 +0000 (0:00:00.036) 0:00:13.081 ******** included: /tmp/tmp7247_7fr/tests/run_blivet.yml for /cache/rhel-x.qcow2 TASK [test lvm and xfs package deps] ******************************************* task path: /tmp/tmp7247_7fr/tests/run_blivet.yml:2 Wednesday 01 June 2022 16:49:27 +0000 (0:00:00.043) 0:00:13.125 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [Assert unexpected required package list is empty] ************************ task path: /tmp/tmp7247_7fr/tests/tests_deps.yml:66 Wednesday 01 June 2022 16:49:28 +0000 (0:00:01.015) 0:00:14.140 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=33 changed=0 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 Wednesday 01 June 2022 16:49:28 +0000 (0:00:00.021) 0:00:14.162 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.23s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.05s /tmp/tmp7247_7fr/tests/tests_deps_scsi_generated.yml:3 ------------------------ test lvm and xfs package deps ------------------------------------------- 1.04s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- test lvm and xfs package deps ------------------------------------------- 1.02s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- test lvm and xfs package deps ------------------------------------------- 1.02s /tmp/tmp7247_7fr/tests/run_blivet.yml:2 --------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 0.97s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.80s /tmp/tmp7247_7fr/tests/tests_deps.yml:2 --------------------------------------- linux-system-roles.storage : make sure required packages are installed --- 0.79s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.52s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.45s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 linux-system-roles.storage : show storage_pools ------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook 
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:49:28 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:49:30 +0000 (0:00:01.530) 0:00:01.553 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.53s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_disk_errors.yml ************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_disk_errors.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:2
Wednesday 01 June 2022 16:49:30 +0000 (0:00:00.031) 0:00:01.585 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:11
Wednesday 01 June 2022 16:49:31 +0000 (0:00:01.069) 0:00:02.654 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:49:31 +0000 (0:00:00.036) 0:00:02.691 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:49:31 +0000 (0:00:00.153) 0:00:02.845 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.513) 0:00:03.359 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) =>
{
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.075) 0:00:03.434 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.022) 0:00:03.457 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.021) 0:00:03.478 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.199) 0:00:03.678 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:49:32 +0000 (0:00:00.019) 0:00:03.697 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:49:33 +0000 (0:00:01.033) 0:00:04.730 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:49:33 +0000 (0:00:00.047) 0:00:04.777 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:49:33 +0000 (0:00:00.044) 0:00:04.822 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:49:34 +0000 (0:00:00.694) 0:00:05.516 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:49:34 +0000 (0:00:00.082) 0:00:05.599 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:49:34 +0000 (0:00:00.021) 0:00:05.621 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK
[linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:49:34 +0000 (0:00:00.018) 0:00:05.642 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:49:34 +0000 (0:00:00.018) 0:00:05.660 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:49:35 +0000 (0:00:00.789) 0:00:06.450 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
            "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
            "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:49:37 +0000 (0:00:01.827) 0:00:08.277 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:49:37 +0000
(0:00:00.042) 0:00:08.319 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:37 +0000 (0:00:00.026) 0:00:08.346 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:49:37 +0000 (0:00:00.538) 0:00:08.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:37 +0000 (0:00:00.067) 0:00:08.952 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:49:37 +0000 (0:00:00.032) 0:00:08.984 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:49:37 +0000 (0:00:00.033) 0:00:09.018 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.031) 0:00:09.050 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.032) 0:00:09.082 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.028) 0:00:09.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.030) 0:00:09.141 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.028) 0:00:09.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.029) 0:00:09.199 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:49:38 +0000 (0:00:00.446) 0:00:09.645 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:49:38 +0000 (0:00:00.026) 0:00:09.671 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:14 Wednesday 01 June 2022 16:49:39 +0000 (0:00:00.810) 0:00:10.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:21 Wednesday 01 June 2022 16:49:39 +0000 (0:00:00.030) 0:00:10.513 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:49:39 +0000 (0:00:00.043) 0:00:10.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.503) 0:00:11.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.035) 0:00:11.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.029) 0:00:11.125 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk volume mounted at "/opt/test1"] **************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:28 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.034) 0:00:11.160 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.047) 0:00:11.207 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.042) 0:00:11.250 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.499) 0:00:11.749 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": 
false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.068) 0:00:11.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.029) 0:00:11.848 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.029) 0:00:11.877 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.061) 0:00:11.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.025) 0:00:11.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.029) 0:00:11.993 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:49:40 +0000 (0:00:00.031) 0:00:12.025 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "/dev/surelyidonotexist" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.033) 0:00:12.058 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.031) 0:00:12.090 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.061) 0:00:12.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.029) 0:00:12.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.030) 0:00:12.212 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.042) 0:00:12.254 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:49:41 +0000 (0:00:00.027) 0:00:12.281 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: unable to resolve disk specified for volume 'test1' (['/dev/surelyidonotexist'])
TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.996) 0:00:13.278 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': u'/opt/test1', u'name': u'test1', u'encryption_password': None, u'encryption': None, u'disks': [u'/dev/surelyidonotexist'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': 
None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"unable to resolve disk specified for volume 'test1' (['/dev/surelyidonotexist'])"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.039) 0:00:13.318 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:43 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.027) 0:00:13.345 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create two volumes w/ the same name] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:62 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.032) 0:00:13.378 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.049) 0:00:13.427 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.048) 0:00:13.476 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:42 +0000 (0:00:00.516) 0:00:13.992 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.069) 0:00:14.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.030) 0:00:14.092 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.034) 0:00:14.126 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:43 
+0000 (0:00:00.061) 0:00:14.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.024) 0:00:14.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.032) 0:00:14.246 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.031) 0:00:14.278 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "vol1", "type": "disk" }, { "disks": [ "sda" ], "name": "vol1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.034) 0:00:14.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.028) 0:00:14.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make 
sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.028) 0:00:14.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.028) 0:00:14.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.030) 0:00:14.429 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.042) 0:00:14.471 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:49:43 +0000 (0:00:00.028) 0:00:14.499 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: multiple volumes with the same name: vol1 TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.981) 0:00:15.481 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': None, u'name': u'vol1', u'encryption_password': None, u'encryption': None, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}, {u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': None, u'name': u'vol1', u'encryption_password': None, u'encryption': None, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': 
u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'multiple volumes with the same name: vol1'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.039) 0:00:15.520 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:79 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.026) 0:00:15.547 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the duplicate volumes test] ************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:85 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.070) 0:00:15.617 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a file system on disk] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:92 
Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.033) 0:00:15.651 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.045) 0:00:15.697 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:44 +0000 (0:00:00.042) 0:00:15.739 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.519) 0:00:16.258 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.073) 0:00:16.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.030) 0:00:16.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.030) 0:00:16.393 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.062) 0:00:16.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.024) 0:00:16.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.029) 0:00:16.510 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is 
undefined" }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.031) 0:00:16.542 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.036) 0:00:16.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.028) 0:00:16.606 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.028) 0:00:16.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.029) 0:00:16.664 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday
01 June 2022 16:49:45 +0000 (0:00:00.029) 0:00:16.694 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.041) 0:00:16.736 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:45 +0000 (0:00:00.030) 0:00:16.766 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 
null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:49:47 +0000 (0:00:01.332) 0:00:18.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.030) 0:00:18.129 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.028) 0:00:18.157 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.035) 0:00:18.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.038) 0:00:18.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.038) 0:00:18.269 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:49:47 +0000 (0:00:00.030) 0:00:18.300 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:49:48 +0000 (0:00:00.968) 0:00:19.268 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:49:48 +0000 (0:00:00.539) 0:00:19.807 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the 
/etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:49:49 +0000 (0:00:00.640) 0:00:20.448 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:49:49 +0000 (0:00:00.357) 0:00:20.806 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:49:49 +0000 (0:00:00.029) 0:00:20.835 ******** ok: [/cache/rhel-x.qcow2] TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:103 Wednesday 01 June 2022 16:49:50 +0000 (0:00:00.882) 0:00:21.717 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Try to replace the file system on disk in safe mode] ********************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:111 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.472) 0:00:22.190 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.048) 0:00:22.239 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 
2022 16:49:51 +0000 (0:00:00.044) 0:00:22.283 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.491) 0:00:22.775 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.069) 0:00:22.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.030) 0:00:22.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.028) 0:00:22.904 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.060) 0:00:22.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.025) 0:00:22.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:51 +0000 (0:00:00.030) 0:00:23.020 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.031) 0:00:23.051 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:52 +0000 
(0:00:00.036) 0:00:23.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.029) 0:00:23.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.028) 0:00:23.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.029) 0:00:23.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.030) 0:00:23.206 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:52 +0000 (0:00:00.043) 0:00:23.250 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 
16:49:52 +0000 (0:00:00.029) 0:00:23.279 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:49:53 +0000 (0:00:01.067) 0:00:24.347 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, 
u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:53 +0000 (0:00:00.040) 0:00:24.387 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:126 Wednesday 01 June 2022 16:49:53 +0000 (0:00:00.026) 0:00:24.413 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:132 Wednesday 01 June 2022 16:49:53 +0000 (0:00:00.033) 0:00:24.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unmount file system] ***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:139 Wednesday 01 June 2022 16:49:53 +0000 
(0:00:00.033) 0:00:24.480 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:53 +0000 (0:00:00.046) 0:00:24.527 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:53 +0000 (0:00:00.042) 0:00:24.569 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.554) 0:00:25.124 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.071) 0:00:25.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.029) 0:00:25.225 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.036) 0:00:25.262 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.067) 0:00:25.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.027) 0:00:25.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.032) 0:00:25.389 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is 
undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.034) 0:00:25.423 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "none", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.037) 0:00:25.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.030) 0:00:25.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.029) 0:00:25.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.029) 0:00:25.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 
June 2022 16:49:54 +0000 (0:00:00.029) 0:00:25.579 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.045) 0:00:25.624 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:54 +0000 (0:00:00.029) 0:00:25.653 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "xfsprogs", "e2fsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : 
Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:49:55 +0000 (0:00:01.048) 0:00:26.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:49:55 +0000 (0:00:00.030) 0:00:26.732 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:49:55 +0000 (0:00:00.027) 0:00:26.759 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "xfsprogs", "e2fsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:49:55 +0000 (0:00:00.037) 0:00:26.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:49:55 +0000 (0:00:00.035) 0:00:26.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:49:55 +0000 (0:00:00.035) 
0:00:26.868 ******** changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:49:56 +0000 (0:00:00.377) 0:00:27.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:49:56 +0000 (0:00:00.657) 0:00:27.903 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:49:56 +0000 (0:00:00.028) 0:00:27.931 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:49:57 +0000 (0:00:00.636) 0:00:28.568 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:49:57 +0000 (0:00:00.366) 0:00:28.935 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 
01 June 2022 16:49:57 +0000 (0:00:00.030) 0:00:28.966 ******** ok: [/cache/rhel-x.qcow2] TASK [Try to replace the file system on disk in safe mode] ********************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:152 Wednesday 01 June 2022 16:49:58 +0000 (0:00:00.897) 0:00:29.863 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:49:58 +0000 (0:00:00.084) 0:00:29.948 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:49:58 +0000 (0:00:00.043) 0:00:29.992 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.540) 0:00:30.532 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) 
=> { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.068) 0:00:30.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.031) 0:00:30.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.029) 0:00:30.662 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.060) 0:00:30.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.027) 0:00:30.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.029) 0:00:30.780 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.031) 0:00:30.811 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.033) 0:00:30.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.028) 0:00:30.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.028) 0:00:30.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.031) 0:00:30.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.029) 0:00:30.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.044) 0:00:31.006 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:49:59 +0000 (0:00:00.029) 0:00:31.035 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:50:01 +0000 (0:00:01.087) 0:00:32.123 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, 
u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.037) 0:00:32.161 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:167 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.027) 0:00:32.189 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:173 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.035) 0:00:32.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remount file system] ***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:180 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.034) 0:00:32.259 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.043) 0:00:32.302 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:01 
+0000 (0:00:00.042) 0:00:32.344 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.522) 0:00:32.867 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.068) 0:00:32.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.033) 0:00:32.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:01 +0000 (0:00:00.031) 0:00:33.001 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.059) 0:00:33.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.026) 0:00:33.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.060) 0:00:33.148 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.032) 0:00:33.180 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 
June 2022 16:50:02 +0000 (0:00:00.036) 0:00:33.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.029) 0:00:33.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.029) 0:00:33.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.028) 0:00:33.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.031) 0:00:33.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.042) 0:00:33.378 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
Wednesday 01 June 2022 16:50:02 +0000 (0:00:00.026) 0:00:33.405 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:50:03 +0000 (0:00:01.078) 0:00:34.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:03 +0000 (0:00:00.033) 0:00:34.517 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:50:03 +0000 (0:00:00.032) 0:00:34.550 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:50:03 +0000 (0:00:00.040) 0:00:34.590 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:50:03 +0000 (0:00:00.034) 0:00:34.625 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:50:03 +0000 (0:00:00.034) 0:00:34.659 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:50:03 +0000 
(0:00:00.028) 0:00:34.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:50:04 +0000 (0:00:00.630) 0:00:35.318 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:50:04 +0000 (0:00:00.384) 0:00:35.702 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:50:05 +0000 (0:00:00.636) 0:00:36.339 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:50:05 +0000 (0:00:00.356) 0:00:36.695 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:50:05 +0000 (0:00:00.028) 0:00:36.724 ******** ok: [/cache/rhel-x.qcow2] TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:191 Wednesday 01 June 2022 16:50:06 +0000 (0:00:00.816) 0:00:37.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102190.5541215, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102190.5541215, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102190.5541215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "378633521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:196 Wednesday 01 June 2022 16:50:06 +0000 (0:00:00.392) 0:00:37.933 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create a partition pool on the disk already containing a file system in safe_mode] *** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:204 Wednesday 01 June 2022 16:50:06 +0000 (0:00:00.035) 0:00:37.969 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:06 +0000 
(0:00:00.047) 0:00:38.017 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.043) 0:00:38.060 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.501) 0:00:38.562 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.071) 0:00:38.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.031) 0:00:38.665 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.031) 0:00:38.696 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.063) 0:00:38.759 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.026) 0:00:38.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.030) 0:00:38.816 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.036) 0:00:38.853 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.035) 0:00:38.888 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.078) 0:00:38.967 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.032) 0:00:38.999 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:07 +0000 (0:00:00.030) 0:00:39.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:08 +0000 (0:00:00.030) 0:00:39.060 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:08 +0000 (0:00:00.044) 0:00:39.104 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:08 +0000 (0:00:00.027) 0:00:39.132 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.998) 0:00:40.131 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.039) 0:00:40.170 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:218
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.026) 0:00:40.197 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the output] *******************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:224
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.034) 0:00:40.232 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Try to create LVM pool on disk that already belongs to an existing filesystem] ***
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:233
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.034) 0:00:40.266 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.049) 0:00:40.315 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.044) 0:00:40.360 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.501) 0:00:40.861 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.073) 0:00:40.935 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.031) 0:00:40.967 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:09 +0000 (0:00:00.030) 0:00:40.997 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.065) 0:00:41.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.027) 0:00:41.090 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.031) 0:00:41.121 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.034) 0:00:41.156 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.032) 0:00:41.188 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.028) 0:00:41.216 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.030) 0:00:41.247 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.028) 0:00:41.275 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.028) 0:00:41.304 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.042) 0:00:41.346 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:10 +0000 (0:00:00.027) 0:00:41.373 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.989) 0:00:42.363 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.070) 0:00:42.434 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:247
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.027) 0:00:42.461 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the output] *******************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:253
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.032) 0:00:42.494 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [stat the file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:260
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.033) 0:00:42.527 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102206.2901216, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102190.5541215, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102190.5541215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "378633521", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [assert file presence] ****************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:265
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.375) 0:00:42.903 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Create a partition pool on the disk already containing a file system w/o safe_mode] ***
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:271
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.035) 0:00:42.938 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.050) 0:00:42.988 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:11 +0000 (0:00:00.044) 0:00:43.033 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.490) 0:00:43.523 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.080) 0:00:43.603 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.032) 0:00:43.635 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.030) 0:00:43.665 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.063) 0:00:43.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.027) 0:00:43.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.030) 0:00:43.787 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.038) 0:00:43.826 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.033) 0:00:43.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.030) 0:00:43.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.038) 0:00:43.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.033) 0:00:43.962 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.029) 0:00:43.992 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:12 +0000 (0:00:00.043) 0:00:44.035 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:13 +0000 (0:00:00.027) 0:00:44.063 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:50:14 +0000 (0:00:01.335) 0:00:45.399 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.031) 0:00:45.430 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.027) 0:00:45.458 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.036) 0:00:45.495 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.035) 0:00:45.530 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.032) 0:00:45.562 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d39aed32-aa84-4a16-aea7-de3d41f931ae" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:50:14 +0000 (0:00:00.379) 0:00:45.941 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:50:15 +0000 (0:00:00.652) 0:00:46.594 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:50:15 +0000 (0:00:00.029) 0:00:46.623 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:50:16 +0000 (0:00:00.647) 0:00:47.271 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:50:16 +0000 (0:00:00.368) 0:00:47.639 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:50:16 +0000 (0:00:00.028) 0:00:47.668 ********
ok: [/cache/rhel-x.qcow2]

TASK [Verify the output] *******************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:281
Wednesday 01 June 2022 16:50:17 +0000 (0:00:00.802) 0:00:48.470 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:286
Wednesday 01 June 2022 16:50:17 +0000 (0:00:00.035) 0:00:48.506 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:17 +0000 (0:00:00.051) 0:00:48.558 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:17 +0000 (0:00:00.043) 0:00:48.601 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.504) 0:00:49.106 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.071) 0:00:49.178 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.029) 0:00:49.207 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.029) 0:00:49.237 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.061) 0:00:49.298 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.026) 0:00:49.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.029) 0:00:49.354 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "partition" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.040) 0:00:49.395 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.033) 0:00:49.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.029) 0:00:49.458 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.030) 0:00:49.488 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.031) 0:00:49.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.029) 0:00:49.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.041) 0:00:49.591 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:18 +0000 (0:00:00.026) 0:00:49.618 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:50:19 +0000 (0:00:01.200) 0:00:50.818 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.031) 0:00:50.849 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.028) 0:00:50.878 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.035) 0:00:50.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list 
of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.035) 0:00:50.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.034) 0:00:50.984 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:50:19 +0000 (0:00:00.029) 0:00:51.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:50:20 +0000 (0:00:00.028) 0:00:51.042 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:50:20 +0000 (0:00:00.028) 0:00:51.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:50:20 +0000 (0:00:00.030) 0:00:51.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:50:20 +0000 (0:00:00.354) 0:00:51.456 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:50:20 +0000 (0:00:00.029) 0:00:51.486 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=185 changed=8 unreachable=0 failed=6 skipped=119 rescued=6 ignored=0 Wednesday 01 June 2022 16:50:21 +0000 (0:00:00.835) 0:00:52.322 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.53s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.20s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.09s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.08s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:2 -------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.98s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.97s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update 
facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
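The destructive run summarized above removed the partition-type pool "foo" from disk sda ("destroy format" on the disklabel). A minimal sketch of the role invocation that would produce that result, with the variable values taken from the `storage_pools` debug output in this log (the surrounding play and role path are assumptions, not the actual test file):

```yaml
# Hypothetical reproduction of the pool removal seen in the log above.
# linux-system-roles.storage treats state: absent on a partition-type
# pool as "destroy the disklabel on the listed disks".
- hosts: all
  tasks:
    - include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: ["sda"]
            state: absent
```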
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:50:22 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:50:23 +0000 (0:00:01.319) 0:00:01.342 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_disk_errors_nvme_generated.yml ********************************* 2 plays in /tmp/tmp7247_7fr/tests/tests_disk_errors_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers 
META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:50:23 +0000 (0:00:00.036) 0:00:01.378 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
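The `*_generated` playbooks run next are thin wrappers around `tests_disk_errors.yml`: the nvme variant's plays are skipped entirely (the inventory script reported no NVMe support on this platform), while the scsi variant pins the disk interface before the real test runs. Based on the task names and paths shown in this log, each wrapper roughly amounts to the following (a sketch, not the actual generated file):

```yaml
# Approximate shape of tests_disk_errors_scsi_generated.yml as suggested
# by the log: pin the disk interface fact, then run the real test playbook.
- hosts: all
  tasks:
    - name: set disk interface for test
      set_fact:
        storage_test_use_interface: "scsi"

- import_playbook: tests_disk_errors.yml
```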
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:50:24 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:50:25 +0000 (0:00:01.277) 0:00:01.301 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_disk_errors_scsi_generated.yml ********************************* 2 plays in /tmp/tmp7247_7fr/tests/tests_disk_errors_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors_scsi_generated.yml:3 
Wednesday 01 June 2022 16:50:25 +0000 (0:00:00.030) 0:00:01.332 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors_scsi_generated.yml:7 Wednesday 01 June 2022 16:50:26 +0000 (0:00:01.056) 0:00:02.388 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:2 Wednesday 01 June 2022 16:50:26 +0000 (0:00:00.027) 0:00:02.415 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:11 Wednesday 01 June 2022 16:50:27 +0000 (0:00:00.814) 0:00:03.230 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:27 +0000 (0:00:00.037) 0:00:03.268 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:27 +0000 (0:00:00.153) 0:00:03.421 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.553) 0:00:03.975 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": 
false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.076) 0:00:04.052 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.022) 0:00:04.074 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.023) 0:00:04.097 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host 
machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.194) 0:00:04.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:28 +0000 (0:00:00.018) 0:00:04.311 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:29 +0000 (0:00:01.083) 0:00:05.394 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:29 +0000 (0:00:00.047) 0:00:05.441 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:50:29 +0000 (0:00:00.046) 0:00:05.487 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:30 +0000 (0:00:00.655) 0:00:06.143 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:50:30 +0000 (0:00:00.081) 0:00:06.225 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:50:30 +0000 (0:00:00.020) 0:00:06.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:50:30 +0000 (0:00:00.021) 0:00:06.267 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:30 +0000 (0:00:00.019) 0:00:06.287 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:31 +0000 (0:00:00.795) 0:00:07.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:33 +0000 (0:00:01.828) 0:00:08.911 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.043) 0:00:08.955 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.028) 0:00:08.983 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.508) 0:00:09.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.029) 0:00:09.521 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.027) 0:00:09.548 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.032) 0:00:09.581 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.031) 0:00:09.613 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.030) 0:00:09.644 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.026) 0:00:09.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.027) 0:00:09.698 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.028) 0:00:09.726 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:50:33 +0000 (0:00:00.027) 0:00:09.753 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:50:34 +0000 (0:00:00.447) 0:00:10.200 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:50:34 +0000 (0:00:00.028) 0:00:10.229 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:14 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.828) 0:00:11.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:21 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.030) 0:00:11.087 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.044) 0:00:11.132 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* 
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.531) 0:00:11.664 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.035) 0:00:11.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.028) 0:00:11.728 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a disk volume mounted at "/opt/test1"] **************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:28 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.033) 0:00:11.761 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.048) 0:00:11.810 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:35 +0000 (0:00:00.043) 0:00:11.853 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.489) 0:00:12.343 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.100) 0:00:12.443 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.030) 0:00:12.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.030) 0:00:12.504 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.062) 0:00:12.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.025) 0:00:12.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.029) 0:00:12.621 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.031) 0:00:12.653 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "/dev/surelyidonotexist" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.033) 0:00:12.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.030) 0:00:12.716 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.032) 0:00:12.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.028) 0:00:12.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.028) 0:00:12.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.042) 0:00:12.848 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:50:36 +0000 (0:00:00.027) 0:00:12.876 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:
unable to resolve disk specified for volume 'test1' (['/dev/surelyidonotexist'])

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:50:38 +0000 (0:00:01.042) 0:00:13.918 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': u'/opt/test1', u'name': u'test1', u'encryption_password': None, u'encryption': None, u'disks': [u'/dev/surelyidonotexist'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"unable to resolve disk specified for volume 'test1' (['/dev/surelyidonotexist'])"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.040) 0:00:13.958 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:43
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.028) 0:00:13.987 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Try to create two volumes w/ the same name] ******************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:62
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.032) 0:00:14.019 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.049) 0:00:14.069 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.046) 0:00:14.115
******** ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.538) 0:00:14.654 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.070) 0:00:14.724 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.031) 0:00:14.755 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.031) 0:00:14.787 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.062) 0:00:14.849 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:38 +0000 (0:00:00.028) 0:00:14.877 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.030) 0:00:14.908 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.035) 0:00:14.944 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "vol1", "type": "disk" }, { "disks": [ "sda" ], "name": "vol1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.034) 0:00:14.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.030) 0:00:15.009 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.028) 0:00:15.037 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.031) 0:00:15.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.029) 0:00:15.098 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:39 +0000 (0:00:00.043) 0:00:15.141 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:39 +0000
(0:00:00.025) 0:00:15.166 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:
multiple volumes with the same name: vol1

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:50:40 +0000 (0:00:01.016) 0:00:16.183 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': None, u'name': u'vol1', u'encryption_password': None, u'encryption': None, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}, {u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_spare_count': None, u'size': None, u'mount_point': None, u'name': u'vol1', u'encryption_password': None, u'encryption': None, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'type': u'disk', u'encryption_cipher': None, u'fs_create_options': None}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'multiple volumes with the same name: vol1'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.041) 0:00:16.224 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:79
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.026) 0:00:16.251 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify the output of the duplicate volumes test] *************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:85
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.032) 0:00:16.283 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Create a file system on disk]
********************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:92
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.032) 0:00:16.315 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.045) 0:00:16.361 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:40 +0000 (0:00:00.043) 0:00:16.404 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.508) 0:00:16.913 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.068) 0:00:16.982 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.029) 0:00:17.011 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.029) 0:00:17.040 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.057) 0:00:17.097 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.025) 0:00:17.123 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.028) 0:00:17.152
******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.033) 0:00:17.185 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.034) 0:00:17.220 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.028) 0:00:17.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.028) 0:00:17.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.028) 0:00:17.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.028) 0:00:17.335 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.044) 0:00:17.379 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:50:41 +0000 (0:00:00.025) 0:00:17.405 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:50:42 +0000 (0:00:01.304) 0:00:18.710 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:50:42 +0000 (0:00:00.028) 0:00:18.738 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:50:42 +0000 (0:00:00.028) 0:00:18.767 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:50:42 +0000 (0:00:00.041) 0:00:18.808 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:50:42 +0000 (0:00:00.033) 0:00:18.842 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:50:42 +0000 (0:00:00.036) 0:00:18.878 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:50:43 +0000 (0:00:00.084) 0:00:18.963 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:50:43 +0000 (0:00:00.897) 0:00:19.861 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=48c8774b-1285-42d3-b35a-7b942687ad8c', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:50:44 +0000 (0:00:00.487) 0:00:20.348 ********
ok:
[/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:50:45 +0000 (0:00:00.646) 0:00:20.994 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:50:45 +0000 (0:00:00.366) 0:00:21.361 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:50:45 +0000 (0:00:00.029) 0:00:21.391 ********
ok: [/cache/rhel-x.qcow2]

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:103
Wednesday 01 June 2022 16:50:46 +0000 (0:00:00.812) 0:00:22.203 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Try to replace the file system on disk in safe mode] *********************
task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:111
Wednesday 01 June 2022 16:50:46 +0000 (0:00:00.541) 0:00:22.745 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:50:46 +0000 (0:00:00.050) 0:00:22.796 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:50:46 +0000 (0:00:00.042) 0:00:22.839 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.544) 0:00:23.383 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.069) 0:00:23.452 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.030) 0:00:23.483 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.031) 0:00:23.514 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.065) 0:00:23.579 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.027) 0:00:23.607 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.031) 0:00:23.638 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.035) 0:00:23.674 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.034) 0:00:23.708 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.028) 0:00:23.737 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.029) 0:00:23.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.041) 0:00:23.809 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.032) 0:00:23.841 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:50:47 +0000 (0:00:00.045) 0:00:23.887 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified
state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:50:48 +0000 (0:00:00.083) 0:00:23.970 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:50:49 +0000 (0:00:01.084) 0:00:25.055 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, 
u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.040) 0:00:25.096 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:126 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.027) 0:00:25.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:132 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.035) 0:00:25.159 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unmount file system] 
***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:139 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.033) 0:00:25.193 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.047) 0:00:25.240 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.045) 0:00:25.285 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.500) 0:00:25.786 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.071) 0:00:25.857 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:49 +0000 (0:00:00.030) 0:00:25.888 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.031) 0:00:25.920 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.064) 0:00:25.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.025) 0:00:26.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.028) 
0:00:26.038 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.031) 0:00:26.070 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "none", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.035) 0:00:26.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.029) 0:00:26.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.029) 0:00:26.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.029) 0:00:26.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.028) 0:00:26.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.042) 0:00:26.264 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:50:50 +0000 (0:00:00.028) 0:00:26.293 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:50:51 +0000 (0:00:01.023) 0:00:27.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.030) 0:00:27.347 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.028) 0:00:27.375 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 
0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.038) 0:00:27.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.035) 0:00:27.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.037) 0:00:27.485 ******** changed: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:50:51 +0000 (0:00:00.395) 0:00:27.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:50:52 +0000 (0:00:00.669) 0:00:28.550 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:50:52 +0000 (0:00:00.031) 0:00:28.582 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:50:53 +0000 (0:00:00.701) 0:00:29.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:50:53 +0000 (0:00:00.386) 0:00:29.670 ******** TASK [linux-system-roles.storage : Update facts] 
******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:50:53 +0000 (0:00:00.028) 0:00:29.698 ******** ok: [/cache/rhel-x.qcow2] TASK [Try to replace the file system on disk in safe mode] ********************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:152 Wednesday 01 June 2022 16:50:54 +0000 (0:00:00.829) 0:00:30.527 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:54 +0000 (0:00:00.043) 0:00:30.570 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:54 +0000 (0:00:00.043) 0:00:30.613 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.511) 0:00:31.125 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.070) 0:00:31.196 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.031) 0:00:31.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.028) 0:00:31.256 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.058) 0:00:31.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.025) 0:00:31.340 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.029) 0:00:31.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.031) 0:00:31.401 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.033) 0:00:31.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.029) 0:00:31.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.029) 0:00:31.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.031) 0:00:31.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.028) 0:00:31.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.043) 0:00:31.597 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:50:55 +0000 (0:00:00.027) 0:00:31.625 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:50:56 +0000 (0:00:01.044) 0:00:32.669 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, 
u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:56 +0000 (0:00:00.085) 0:00:32.755 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:167 Wednesday 01 June 2022 16:50:56 +0000 (0:00:00.028) 0:00:32.784 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:173 Wednesday 01 June 2022 16:50:56 +0000 (0:00:00.032) 0:00:32.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remount file system] ***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:180 Wednesday 01 June 2022 16:50:56 +0000 (0:00:00.031) 0:00:32.848 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:50:56 +0000 (0:00:00.042) 0:00:32.891 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:50:57 
+0000 (0:00:00.043) 0:00:32.935 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.507) 0:00:33.442 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.078) 0:00:33.521 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.037) 0:00:33.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.031) 0:00:33.590 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.062) 0:00:33.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.024) 0:00:33.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.030) 0:00:33.707 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.032) 0:00:33.740 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 
June 2022 16:50:57 +0000 (0:00:00.041) 0:00:33.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.030) 0:00:33.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.030) 0:00:33.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.029) 0:00:33.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:50:57 +0000 (0:00:00.030) 0:00:33.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:50:58 +0000 (0:00:00.044) 0:00:33.948 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
Wednesday 01 June 2022 16:50:58 +0000 (0:00:00.026) 0:00:33.975 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:50:59 +0000 (0:00:01.028) 0:00:35.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.027) 0:00:35.031 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.027) 0:00:35.059 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.036) 0:00:35.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.032) 0:00:35.128 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.035) 0:00:35.164 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:50:59 +0000 
(0:00:00.027) 0:00:35.191 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:50:59 +0000 (0:00:00.673) 0:00:35.865 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=48c8774b-1285-42d3-b35a-7b942687ad8c', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:00 +0000 (0:00:00.404) 0:00:36.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:01 +0000 (0:00:00.652) 0:00:36.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:01 +0000 (0:00:00.374) 0:00:37.296 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:01 +0000 (0:00:00.029) 0:00:37.325 ******** ok: [/cache/rhel-x.qcow2] TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:191 Wednesday 01 June 2022 16:51:02 +0000 (0:00:00.830) 0:00:38.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102246.2371216, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102246.2371216, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102246.2371216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "87485144", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:196 Wednesday 01 June 2022 16:51:02 +0000 (0:00:00.385) 0:00:38.541 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create a partition pool on the disk already containing a file system in safe_mode] *** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:204 Wednesday 01 June 2022 16:51:02 +0000 (0:00:00.034) 0:00:38.576 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:02 +0000 
(0:00:00.047) 0:00:38.624 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:02 +0000 (0:00:00.041) 0:00:38.665 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.540) 0:00:39.206 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.075) 0:00:39.282 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.031) 0:00:39.313 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.031) 0:00:39.344 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.063) 0:00:39.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.024) 0:00:39.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.029) 0:00:39.463 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.035) 0:00:39.499 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.034) 0:00:39.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.030) 0:00:39.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.029) 0:00:39.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.030) 0:00:39.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.029) 0:00:39.653 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.045) 0:00:39.698 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:03 +0000 (0:00:00.030) 0:00:39.729 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:51:04 +0000 (0:00:01.109) 0:00:40.839 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, 
u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:04 +0000 (0:00:00.039) 0:00:40.879 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:218 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.027) 0:00:40.906 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:224 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.035) 0:00:40.942 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM pool on disk that already belongs to an existing filesystem] *** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:233 
Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.035) 0:00:40.977 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.047) 0:00:41.024 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.040) 0:00:41.065 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.574) 0:00:41.639 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.070) 0:00:41.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.030) 0:00:41.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.029) 0:00:41.770 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.061) 0:00:41.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.024) 0:00:41.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:05 +0000 (0:00:00.029) 0:00:41.885 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": 
"lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.035) 0:00:41.921 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.034) 0:00:41.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.029) 0:00:41.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.029) 0:00:42.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.028) 0:00:42.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.028) 
0:00:42.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.043) 0:00:42.115 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:06 +0000 (0:00:00.029) 0:00:42.144 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:51:07 +0000 (0:00:01.052) 0:00:43.197 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or 
devices on disk 'sda' (pool 'foo') in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.039) 0:00:43.236 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:247 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.027) 0:00:43.263 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:253 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.032) 0:00:43.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:260 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.034) 0:00:43.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102262.0281215, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102246.2371216, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102246.2371216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "87485144", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] 
**************************************************** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:265 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.384) 0:00:43.715 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a partition pool on the disk already containing a file system w/o safe_mode] *** task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:271 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.036) 0:00:43.752 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.055) 0:00:43.808 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:07 +0000 (0:00:00.047) 0:00:43.855 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.528) 0:00:44.384 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.074) 0:00:44.458 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.032) 0:00:44.490 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.030) 0:00:44.521 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.063) 0:00:44.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.026) 0:00:44.611 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.030) 0:00:44.641 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.034) 0:00:44.676 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.071) 0:00:44.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.030) 0:00:44.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.029) 0:00:44.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.028) 0:00:44.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:08 +0000 (0:00:00.027) 0:00:44.864 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:09 +0000 (0:00:00.040) 0:00:44.904 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:09 +0000 (0:00:00.027) 0:00:44.932 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:51:10 +0000 (0:00:01.391) 0:00:46.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:10 +0000 (0:00:00.029) 0:00:46.353 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:51:10 +0000 (0:00:00.025) 0:00:46.378 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list 
of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:51:10 +0000 (0:00:00.036) 0:00:46.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:10 +0000 (0:00:00.034) 0:00:46.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:51:10 +0000 (0:00:00.034) 0:00:46.484 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=48c8774b-1285-42d3-b35a-7b942687ad8c', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=48c8774b-1285-42d3-b35a-7b942687ad8c" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 
16:51:10 +0000 (0:00:00.395) 0:00:46.880 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:11 +0000 (0:00:00.676) 0:00:47.556 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:11 +0000 (0:00:00.029) 0:00:47.586 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:12 +0000 (0:00:00.651) 0:00:48.237 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:12 +0000 (0:00:00.376) 0:00:48.613 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:12 +0000 (0:00:00.028) 0:00:48.642 ******** ok: [/cache/rhel-x.qcow2] TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_disk_errors.yml:281 Wednesday 01 June 2022 16:51:13 +0000 (0:00:00.846) 0:00:49.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_disk_errors.yml:286 Wednesday 01 June 2022 16:51:13 +0000 (0:00:00.034) 0:00:49.523 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:13 +0000 (0:00:00.050) 0:00:49.573 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:13 +0000 (0:00:00.041) 0:00:49.615 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.507) 0:00:50.123 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list 
of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.070) 0:00:50.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.030) 0:00:50.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.029) 0:00:50.254 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.060) 0:00:50.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.059) 0:00:50.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.030) 0:00:50.404 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { 
"disks": [ "sda" ], "name": "foo", "state": "absent", "type": "partition" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.035) 0:00:50.440 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.031) 0:00:50.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.029) 0:00:50.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.028) 0:00:50.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.037) 0:00:50.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.029) 0:00:50.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.042) 0:00:50.639 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:14 +0000 (0:00:00.026) 0:00:50.666 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:51:15 +0000 (0:00:01.217) 0:00:51.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.030) 0:00:51.914 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.028) 0:00:51.942 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.035) 0:00:51.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list 
of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.033) 0:00:52.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.034) 0:00:52.047 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.029) 0:00:52.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.028) 0:00:52.105 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.028) 0:00:52.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.028) 0:00:52.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.382) 0:00:52.543 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:16 +0000 (0:00:00.027) 0:00:52.571 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=187 changed=8 unreachable=0 failed=6 skipped=119 rescued=6 ignored=0 Wednesday 01 June 2022 16:51:17 +0000 (0:00:00.854) 0:00:53.426 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.39s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.31s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.22s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.08s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.06s /tmp/tmp7247_7fr/tests/tests_disk_errors_scsi_generated.yml:3 ----------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update 
facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
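The run above exercises the role's safe-mode guard: the first attempt fails with "cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode", and a later attempt without safe mode destroys the ext4 format and creates the disklabel. As an editorial sketch only (not part of this log — `storage_pools` and `storage_safe_mode` follow the role's documented interface, but the host pattern and disk name are placeholders), a play reproducing that refusal might look like:

```yaml
# Sketch, not from this log: triggers the safe-mode refusal seen above.
# storage_safe_mode defaults to true, so the role refuses to destroy
# the existing ext4 filesystem on sda.
- hosts: all
  roles:
    - linux-system-roles.storage
  vars:
    storage_safe_mode: true
    storage_pools:
      - name: foo
        type: partition
        disks: ['sda']   # sda already carries a filesystem
```

Setting `storage_safe_mode: false` instead lets the role proceed, producing the "destroy format" / "create format" actions recorded in the "manage the pools and volumes to match the specified state" task output above.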
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:51:18 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:51:19 +0000 (0:00:01.266) 0:00:01.289 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_existing_lvm_pool.yml ****************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:2 Wednesday 01 June 2022 
16:51:19 +0000 (0:00:00.012) 0:00:01.302 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:13 Wednesday 01 June 2022 16:51:20 +0000 (0:00:01.105) 0:00:02.408 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:20 +0000 (0:00:00.039) 0:00:02.447 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:20 +0000 (0:00:00.149) 0:00:02.597 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.517) 0:00:03.114 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.078) 0:00:03.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.023) 0:00:03.216 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.022) 0:00:03.239 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.196) 0:00:03.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:21 +0000 (0:00:00.018) 0:00:03.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:22 +0000 (0:00:01.067) 0:00:04.521 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:22 +0000 (0:00:00.046) 0:00:04.568 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:22 +0000 (0:00:00.049) 0:00:04.618 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:23 +0000 (0:00:00.680) 0:00:05.298 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:51:23 +0000 (0:00:00.077) 0:00:05.376 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:51:23 +0000 (0:00:00.021) 0:00:05.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:51:23 +0000 (0:00:00.021) 0:00:05.418 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:23 +0000 (0:00:00.019) 0:00:05.438 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:24 +0000 (0:00:00.827) 0:00:06.266 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:26 +0000 (0:00:01.835) 0:00:08.102 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:26 +0000 
(0:00:00.042) 0:00:08.145 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:26 +0000 (0:00:00.027) 0:00:08.172 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:51:26 +0000 (0:00:00.538) 0:00:08.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.028) 0:00:08.739 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.024) 0:00:08.764 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.033) 0:00:08.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.033) 0:00:08.831 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.030) 0:00:08.861 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.027) 0:00:08.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.052) 0:00:08.941 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.026) 0:00:08.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.027) 0:00:08.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:51:27 +0000 (0:00:00.481) 0:00:09.477 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:27 +0000 (0:00:00.027) 0:00:09.505 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:16 Wednesday 01 June 2022 16:51:28 +0000 (0:00:00.862) 0:00:10.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:23 Wednesday 01 June 2022 16:51:28 +0000 (0:00:00.029) 0:00:10.397 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:51:28 +0000 (0:00:00.043) 0:00:10.440 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.501) 0:00:10.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.032) 0:00:10.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.027) 0:00:11.003 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create one LVM logical volume under one volume group] ******************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:28 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.032) 0:00:11.035 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.052) 0:00:11.088 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.043) 0:00:11.132 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.509) 0:00:11.641 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:29 +0000 (0:00:00.067) 0:00:11.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.029) 0:00:11.739 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.029) 0:00:11.768 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.059) 0:00:11.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.024) 0:00:11.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.028) 0:00:11.880 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.034) 0:00:11.914 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.032) 0:00:11.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.028) 0:00:11.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.028) 0:00:12.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:30 +0000 
(0:00:00.028) 0:00:12.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.028) 0:00:12.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.042) 0:00:12.102 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:30 +0000 (0:00:00.059) 0:00:12.162 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", 
"volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:51:32 +0000 (0:00:01.988) 0:00:14.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.028) 0:00:14.180 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.027) 0:00:14.207 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", 
"fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
16:51:32 +0000 (0:00:00.039) 0:00:14.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.041) 0:00:14.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 
2022 16:51:32 +0000 (0:00:00.039) 0:00:14.328 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.030) 0:00:14.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.038) 0:00:14.397 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.033) 0:00:14.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:32 +0000 (0:00:00.029) 0:00:14.460 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:33 +0000 (0:00:00.370) 0:00:14.830 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:33 +0000 (0:00:00.029) 0:00:14.860 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:39 Wednesday 01 June 2022 16:51:33 +0000 (0:00:00.825) 0:00:15.686 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:51:34 +0000 (0:00:00.050) 0:00:15.736 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:51:34 +0000 (0:00:00.038) 0:00:15.775 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:51:34 +0000 (0:00:00.030) 0:00:15.806 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "091c6014-0484-4e43-b319-8cd794d37671" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rrGaCk-hR0x-XoCE-bF3k-a7jL-92oP-erSOLx" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the 
/etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:51:34 +0000 (0:00:00.506) 0:00:16.312 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004079", "end": "2022-06-01 12:51:34.490430", "rc": 0, "start": "2022-06-01 12:51:34.486351" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.514) 0:00:16.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002715", "end": "2022-06-01 12:51:34.883991", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:51:34.881276" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.393) 0:00:17.220 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.062) 0:00:17.283 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.031) 0:00:17.315 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.062) 0:00:17.378 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:51:35 +0000 (0:00:00.040) 0:00:17.419 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.519) 0:00:17.938 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.078) 0:00:18.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.037) 0:00:18.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.035) 0:00:18.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.034) 0:00:18.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.028) 0:00:18.152 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.043) 0:00:18.196 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.054) 0:00:18.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.030) 0:00:18.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.030) 0:00:18.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.031) 0:00:18.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.028) 0:00:18.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.028) 0:00:18.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.029) 0:00:18.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.029) 0:00:18.459 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.029) 0:00:18.489 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.070) 0:00:18.560 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.062) 0:00:18.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.031) 0:00:18.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.030) 0:00:18.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:51:36 +0000 (0:00:00.027) 0:00:18.713 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.060) 0:00:18.773 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.036) 0:00:18.810 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.035) 0:00:18.846 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.056) 0:00:18.903 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.034) 0:00:18.937 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.033) 0:00:18.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.029) 0:00:19.000 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.028) 0:00:19.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.030) 0:00:19.059 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.030) 0:00:19.089 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.030) 0:00:19.120 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.065) 0:00:19.185 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK
[get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.064) 0:00:19.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.030) 0:00:19.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.031) 0:00:19.311 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.030) 0:00:19.342 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.031) 0:00:19.374 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.031) 0:00:19.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.029) 0:00:19.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.029) 0:00:19.464 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.031) 0:00:19.496 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.029) 0:00:19.526 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.057) 0:00:19.583 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:51:37 +0000 (0:00:00.041) 0:00:19.624 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for
/cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.164) 0:00:19.789 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.036) 0:00:19.826 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.039) 0:00:19.866 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.029) 0:00:19.895 ********
ok: [/cache/rhel-x.qcow2] =>
{ "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.033) 0:00:19.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.029) 0:00:19.958 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.031) 0:00:19.990 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.029) 0:00:20.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.029) 0:00:20.050 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.030) 0:00:20.080 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.044) 0:00:20.125 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.033) 0:00:20.158 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.037) 0:00:20.195 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.029) 0:00:20.225 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.030) 0:00:20.256 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.036) 0:00:20.292 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.035) 0:00:20.327 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102291.7501216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102291.7501216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10075, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102291.7501216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:51:38 +0000 (0:00:00.381) 0:00:20.709 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.043) 0:00:20.753 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.037) 0:00:20.790 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.039) 0:00:20.829 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.031) 0:00:20.860 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.035) 0:00:20.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.028) 0:00:20.924 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:51:39
+0000 (0:00:00.029) 0:00:20.953 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:20.983 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.037) 0:00:21.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.030) 0:00:21.140 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.170 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.039) 0:00:21.209 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.035) 0:00:21.245 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.274 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.304 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:51:39 +0000
(0:00:00.028) 0:00:21.333 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.030) 0:00:21.363 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.032) 0:00:21.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.030) 0:00:21.426 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.456 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.030) 0:00:21.486 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday
01 June 2022 16:51:39 +0000 (0:00:00.030) 0:00:21.516 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.029) 0:00:21.545 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:51:39 +0000 (0:00:00.032) 0:00:21.578 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.524) 0:00:22.103 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.370) 0:00:22.473 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.036) 0:00:22.510 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.034) 0:00:22.545
********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.032) 0:00:22.577 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.030) 0:00:22.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.032) 0:00:22.640 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.030) 0:00:22.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:51:40 +0000 (0:00:00.030) 0:00:22.701 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.033) 0:00:22.735 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.032) 0:00:22.767 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.037) 0:00:22.805 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032776", "end": "2022-06-01 12:51:40.884164", "rc": 0, "start": "2022-06-01 12:51:40.851388" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.415) 0:00:23.220 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.040) 0:00:23.261 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.040) 0:00:23.302 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:51:41 +0000
(0:00:00.031) 0:00:23.333 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.034) 0:00:23.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.033) 0:00:23.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.033) 0:00:23.434 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.034) 0:00:23.468 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.031) 0:00:23.499 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.029) 0:00:23.529 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null,
"storage_test_volume": null }, "changed": false }

TASK [Create another volume in the existing pool, identified only by name.] ****
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:41
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.030) 0:00:23.560 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.065) 0:00:23.625 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:51:41 +0000 (0:00:00.044) 0:00:23.669 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.512) 0:00:24.181 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item",
"changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.074) 0:00:24.256 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.037) 0:00:24.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.031) 0:00:24.324 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.068) 0:00:24.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.029) 0:00:24.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.031) 0:00:24.454 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "name": "foo", "volumes": [ { "fs_label": "newvol", "fs_type": "ext4", "name": "newvol", "size": "2 GiB" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.036) 0:00:24.490 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.032) 0:00:24.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.031) 0:00:24.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.031) 0:00:24.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.038) 0:00:24.625 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.030) 0:00:24.656 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.046) 0:00:24.702 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:51:42 +0000 (0:00:00.028) 0:00:24.730 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-newvol" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": 
"/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:51:44 +0000 (0:00:01.525) 0:00:26.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.030) 0:00:26.287 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.027) 0:00:26.314 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", 
"/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-newvol" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.039) 0:00:26.353 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.036) 0:00:26.390 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.035) 0:00:26.426 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.075) 0:00:26.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.030) 0:00:26.531 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.028) 0:00:26.560 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:44 +0000 (0:00:00.029) 0:00:26.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:45 +0000 (0:00:00.369) 0:00:26.958 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:45 +0000 (0:00:00.041) 0:00:27.000 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:53 Wednesday 01 June 2022 16:51:46 +0000 (0:00:00.860) 0:00:27.861 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml 
for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:51:46 +0000 (0:00:00.053) 0:00:27.915 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:51:46 +0000 (0:00:00.039) 0:00:27.954 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:51:46 +0000 (0:00:00.032) 0:00:27.987 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-newvol": { "fstype": "ext4", "label": "newvol", "name": "/dev/mapper/foo-newvol", "size": "2G", "type": "lvm", "uuid": "38ec493d-3619-4b4d-8607-7dd709227756" }, "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "091c6014-0484-4e43-b319-8cd794d37671" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "rrGaCk-hR0x-XoCE-bF3k-a7jL-92oP-erSOLx" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": 
"", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:51:46 +0000 (0:00:00.398) 0:00:28.385 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002630", "end": "2022-06-01 12:51:46.444219", "rc": 0, "start": "2022-06-01 12:51:46.441589" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.396) 0:00:28.782 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002854", "end": "2022-06-01 12:51:46.827516", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:51:46.824662" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.377) 0:00:29.159 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
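The loop-variable warning above suggests its own fix: rename the inner loop variable with `loop_control`. A minimal sketch of what the offending include and its remedy might look like — the task name and list variable are taken from this log, but the exact file contents and the replacement variable name `storage_test_pool_outer` are illustrative assumptions, not the repository's actual code:

```yaml
# Hypothetical reconstruction (assumption, not the repo's actual task):
# the enclosing scope already uses 'storage_test_pool', so the looped
# include should pick a different loop variable via loop_control.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_outer  # any otherwise-unused name silences the warning
```

With an explicit `loop_var`, Ansible no longer shadows the already-defined `storage_test_pool`, and the "loop variable … is already in use" warning (seen again below for `storage_test_volume`) would not be emitted for this task.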
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.064) 0:00:29.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.031) 0:00:29.255 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.063) 0:00:29.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:51:47 +0000 (0:00:00.039) 0:00:29.358 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.416) 0:00:29.775 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.041) 0:00:29.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.038) 0:00:29.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.035) 0:00:29.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.035) 0:00:29.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.032) 0:00:29.958 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.078) 0:00:30.037 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.056) 0:00:30.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.030) 0:00:30.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.033) 0:00:30.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.030) 0:00:30.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.030) 0:00:30.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.029) 0:00:30.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 16:51:48 +0000 (0:00:00.031) 0:00:30.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.030) 0:00:30.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.032) 0:00:30.342 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.059) 0:00:30.402 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.059) 0:00:30.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.031) 0:00:30.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 16:51:48 +0000 (0:00:00.030) 0:00:30.523 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.029) 0:00:30.553 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.060) 0:00:30.613 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.038) 0:00:30.652 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:51:48 +0000 (0:00:00.035) 0:00:30.687 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.056) 0:00:30.744 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
*******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.034) 0:00:30.778 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.035) 0:00:30.813 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.032) 0:00:30.846 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.035) 0:00:30.881 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.032) 0:00:30.914 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.033) 0:00:30.947 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.032) 0:00:30.980 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.067) 0:00:31.047 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.065) 0:00:31.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.031) 0:00:31.144 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.032) 0:00:31.176 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.029) 0:00:31.206 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.030) 0:00:31.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.030) 0:00:31.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.029) 0:00:31.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.029) 0:00:31.326 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.033) 0:00:31.360 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.030) 0:00:31.391 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.058) 0:00:31.450 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.035) 0:00:31.486 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.120) 0:00:31.606 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-newvol" }, "changed": false }

TASK [Set some facts] **********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.035) 0:00:31.642 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.040) 0:00:31.682 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:51:49 +0000 (0:00:00.030) 0:00:31.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.034) 0:00:31.748 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.031) 0:00:31.779 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.071) 0:00:31.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.030) 0:00:31.881 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.028) 0:00:31.909 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.029) 0:00:31.938 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.047) 0:00:31.986 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.032) 0:00:32.019 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.035) 0:00:32.054 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.030) 0:00:32.085 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.030) 0:00:32.116 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.037) 0:00:32.153 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.036) 0:00:32.190 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102303.8571215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102303.8571215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10137, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102303.8571215, "nlink": 1, "path": "/dev/mapper/foo-newvol", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.397) 0:00:32.587 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.038) 0:00:32.626 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.036) 0:00:32.662 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.032) 0:00:32.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:51:50 +0000 (0:00:00.032) 0:00:32.727
********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.034) 0:00:32.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:32.792 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:32.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.031) 0:00:32.854 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.036) 0:00:32.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.029) 0:00:32.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:32.951 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:32.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.029) 0:00:33.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.034) 0:00:33.045 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.037) 0:00:33.083 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.034) 0:00:33.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:33.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.031) 0:00:33.179 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:33.210 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.035) 0:00:33.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.031) 0:00:33.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.031) 0:00:33.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:33.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.031) 0:00:33.371 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.030) 0:00:33.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.036) 0:00:33.438 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:51:51 +0000 (0:00:00.034) 0:00:33.473 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.375) 0:00:33.848 ********
ok:
[/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.384) 0:00:34.233 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.041) 0:00:34.275 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.037) 0:00:34.312 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.031) 0:00:34.343 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.031) 0:00:34.375 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.031) 0:00:34.406 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.034) 0:00:34.441 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.030) 0:00:34.472 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.033) 0:00:34.505 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.032) 0:00:34.538 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:51:52 +0000 (0:00:00.039) 0:00:34.577 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/newvol" ], "delta": "0:00:00.039840", "end": "2022-06-01 12:51:52.661724", "rc": 0, "start": "2022-06-01 12:51:52.621884" }

STDOUT:

LVM2_LV_NAME=newvol LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.418) 0:00:34.995 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.038) 0:00:35.034 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.039) 0:00:35.074 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.033) 0:00:35.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.032) 0:00:35.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.031) 0:00:35.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.030) 0:00:35.201 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.030) 0:00:35.232 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.064) 0:00:35.296 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.028) 0:00:35.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up.]
***************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:55
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.030) 0:00:35.355 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.070) 0:00:35.425 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:51:53 +0000 (0:00:00.045) 0:00:35.471 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.523) 0:00:35.995 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.072) 0:00:36.067 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.033) 0:00:36.101 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.030) 0:00:36.131 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.072) 0:00:36.204 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.031) 0:00:36.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.034) 0:00:36.270 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "name": "foo", "state": "absent" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.036) 0:00:36.307 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.033) 0:00:36.340 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.031) 0:00:36.371 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.031) 0:00:36.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.031) 0:00:36.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.032) 0:00:36.466 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.046) 0:00:36.513 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:51:54 +0000 (0:00:00.028) 0:00:36.542 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:51:56 +0000 (0:00:01.978) 0:00:38.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.032) 0:00:38.553 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.028) 0:00:38.581 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ],
"volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.037) 0:00:38.619 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.035) 0:00:38.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.034) 0:00:38.690 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:51:56 +0000 (0:00:00.030) 0:00:38.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:51:57 +0000 (0:00:00.031) 0:00:38.752 ******** TASK [linux-system-roles.storage : tell systemd to 
refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:51:57 +0000 (0:00:00.030) 0:00:38.782 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:51:57 +0000 (0:00:00.030) 0:00:38.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:51:57 +0000 (0:00:00.382) 0:00:39.195 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:51:57 +0000 (0:00:00.030) 0:00:39.225 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:63 Wednesday 01 June 2022 16:51:58 +0000 (0:00:00.922) 0:00:40.147 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:51:58 +0000 (0:00:00.058) 0:00:40.205 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:51:58 +0000 (0:00:00.036) 0:00:40.242 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:51:58 +0000 (0:00:00.027) 0:00:40.269 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": 
{ "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:51:58 +0000 (0:00:00.384) 0:00:40.654 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002660", "end": "2022-06-01 12:51:58.694945", "rc": 0, "start": "2022-06-01 12:51:58.692285" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.373) 0:00:41.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002882", "end": "2022-06-01 12:51:59.066190", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:51:59.063308" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.370) 0:00:41.399 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.057) 0:00:41.456 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.029) 0:00:41.486 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.059) 0:00:41.546 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.036) 0:00:41.583 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.026) 0:00:41.609 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.029) 0:00:41.639 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.038) 0:00:41.677 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:51:59 +0000 (0:00:00.036) 0:00:41.714 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.035) 0:00:41.750 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:41.779 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.026) 0:00:41.806 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.055) 0:00:41.862 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:41.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:41.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:41.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.032) 0:00:41.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.028) 0:00:42.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:42.040 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.027) 0:00:42.068 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.028) 0:00:42.097 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.056) 0:00:42.154 ********

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.026) 0:00:42.180 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.060) 0:00:42.241 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.036) 0:00:42.278 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.030) 0:00:42.308 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.066) 0:00:42.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.031) 0:00:42.406 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.062) 0:00:42.468 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.027) 0:00:42.496 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.031) 0:00:42.527 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.027) 0:00:42.555 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.029) 0:00:42.584 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.026) 0:00:42.611 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=241 changed=3 unreachable=0 failed=0 skipped=189 rescued=0 ignored=0

Wednesday 01 June 2022 16:52:00 +0000 (0:00:00.015) 0:00:42.627 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.99s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:2 --------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : get required packages ---------------------- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
parse the actual size of the volume ------------------------------------- 0.52s
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 --------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
Get the canonical device path for each member device -------------------- 0.52s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
Read the /etc/fstab file for volume existence --------------------------- 0.51s
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 -----------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:52:01 +0000 (0:00:00.024) 0:00:00.024 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:52:02 +0000 (0:00:01.255) 0:00:01.280 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_existing_lvm_pool_nvme_generated.yml ***************************
2 plays in /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:52:02 +0000 (0:00:00.019) 0:00:01.299 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:52:03 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:52:04 +0000 (0:00:01.273) 0:00:01.295 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_existing_lvm_pool_scsi_generated.yml *************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_scsi_generated.yml:3
Wednesday 01 June 2022 16:52:04 +0000 (0:00:00.015) 0:00:01.310 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_scsi_generated.yml:7
Wednesday 01 June 2022 16:52:05 +0000 (0:00:01.070) 0:00:02.381 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:2
Wednesday 01 June 2022 16:52:06 +0000 (0:00:00.025) 0:00:02.406 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:13
Wednesday 01 June 2022 16:52:06 +0000 (0:00:00.786) 0:00:03.192 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:52:06 +0000 (0:00:00.039) 0:00:03.231 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:52:06 +0000 (0:00:00.157) 0:00:03.389 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.543) 0:00:03.933 ********
skipping:
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.076) 0:00:04.009 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.023) 0:00:04.033 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.023) 0:00:04.056 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for
/cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.192) 0:00:04.249 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:52:07 +0000 (0:00:00.022) 0:00:04.272 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:52:08 +0000 (0:00:01.076) 0:00:05.348 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.044) 0:00:05.393 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.043) 0:00:05.437 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.705) 0:00:06.142 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.079) 0:00:06.222 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.026) 0:00:06.248 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.024) 0:00:06.273 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:52:09 +0000 (0:00:00.020) 0:00:06.293 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:52:10 +0000 (0:00:00.861) 0:00:07.155 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": 
{ "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": 
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:52:12 +0000 (0:00:01.814) 0:00:08.969 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:52:12 +0000 (0:00:00.044) 0:00:09.014 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:52:12 +0000 (0:00:00.063) 0:00:09.077 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.531) 0:00:09.608 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.026) 0:00:09.638 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.026) 0:00:09.664 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.032) 0:00:09.696 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.032) 0:00:09.729 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.030) 0:00:09.759 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.027) 0:00:09.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.028) 0:00:09.815 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.026) 0:00:09.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.027) 0:00:09.869 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.449) 0:00:10.319 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:52:13 +0000 (0:00:00.029) 0:00:10.348 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:16 Wednesday 01 June 2022 16:52:14 +0000 (0:00:00.803) 0:00:11.151 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:23 Wednesday 01 June 2022 16:52:14 +0000 (0:00:00.030) 0:00:11.182 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:52:14 +0000 (0:00:00.042) 0:00:11.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK 
[Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.534) 0:00:11.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.036) 0:00:11.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.028) 0:00:11.824 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create one LVM logical volume under one volume group] ******************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:28 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.031) 0:00:11.856 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.055) 0:00:11.911 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:52:15 +0000 (0:00:00.042) 0:00:11.953 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.539) 0:00:12.493 ******** skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.068) 0:00:12.561 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.060) 0:00:12.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.030) 0:00:12.652 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.061) 0:00:12.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.024) 0:00:12.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.030) 0:00:12.769 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.034) 0:00:12.804 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.031) 0:00:12.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 
16:52:16 +0000 (0:00:00.029) 0:00:12.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.029) 0:00:12.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.029) 0:00:12.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.030) 0:00:12.953 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.042) 0:00:12.996 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:52:16 +0000 (0:00:00.029) 0:00:13.025 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": 
"create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:52:18 +0000 (0:00:01.686) 0:00:14.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.029) 0:00:14.741 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.027) 0:00:14.769 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.039) 0:00:14.808 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.042) 0:00:14.851 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.043) 0:00:14.894 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.030) 0:00:14.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.029) 0:00:14.954 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.027) 0:00:14.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.028) 0:00:15.010 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:52:18 +0000 (0:00:00.370) 0:00:15.381 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:52:19 +0000 (0:00:00.028) 0:00:15.410 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:39 Wednesday 01 June 2022 16:52:19 +0000 (0:00:00.853) 0:00:16.263 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:52:19 +0000 (0:00:00.050) 0:00:16.314 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:52:19 +0000 (0:00:00.039) 0:00:16.354 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:52:19 +0000 (0:00:00.032) 0:00:16.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a6293297-0ffc-4812-8f1c-6e777f3c0646" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "JU44Xy-A7Bn-Up5z-CEWT-0tU4-260x-RsWE9f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": 
"partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:52:20 +0000 (0:00:00.510) 0:00:16.897 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002680", "end": "2022-06-01 12:52:20.414088", "rc": 0, "start": "2022-06-01 12:52:20.411408" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:52:21 +0000 (0:00:00.508) 0:00:17.405 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003205", "end": "2022-06-01 12:52:20.789347", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:52:20.786142" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:52:21 +0000 
(0:00:00.377) 0:00:17.783 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:52:21 +0000 (0:00:00.062) 0:00:17.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:52:21 +0000 (0:00:00.031) 0:00:17.877 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:52:21 +0000 (0:00:00.063) 0:00:17.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:52:21 +0000 (0:00:00.038) 0:00:17.979 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.477) 0:00:18.456 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, 
"idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.041) 0:00:18.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.037) 0:00:18.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.035) 0:00:18.571 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.035) 0:00:18.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.030) 0:00:18.637 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.040) 0:00:18.678 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information 
about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.057) 0:00:18.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.029) 0:00:18.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.030) 0:00:18.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.029) 0:00:18.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.029) 0:00:18.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.031) 0:00:18.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 
Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.044) 0:00:18.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.033) 0:00:18.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.030) 0:00:18.996 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.061) 0:00:19.057 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.060) 0:00:19.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.029) 0:00:19.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.029) 0:00:19.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.031) 0:00:19.208 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.070) 0:00:19.278 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.035) 0:00:19.314 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:52:22 +0000 (0:00:00.033) 0:00:19.347 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.058) 0:00:19.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.034) 0:00:19.441 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.035) 0:00:19.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.030) 0:00:19.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.028) 0:00:19.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.036) 0:00:19.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.031) 0:00:19.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, 
"_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.029) 0:00:19.632 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.061) 0:00:19.694 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.062) 0:00:19.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.030) 0:00:19.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.030) 0:00:19.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.031) 0:00:19.849 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.030) 0:00:19.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.063) 0:00:19.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.030) 0:00:19.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.031) 0:00:20.005 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.029) 0:00:20.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.029) 0:00:20.064 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for 
/cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.059) 0:00:20.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.034) 0:00:20.158 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.126) 0:00:20.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.033) 0:00:20.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], 
"storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:52:23 +0000 (0:00:00.041) 0:00:20.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.030) 0:00:20.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.035) 0:00:20.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.028) 0:00:20.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.028) 0:00:20.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.028) 0:00:20.512 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.030) 0:00:20.542 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.029) 0:00:20.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.042) 0:00:20.615 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.033) 0:00:20.648 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.036) 0:00:20.684 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.030) 0:00:20.715 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.033) 0:00:20.748 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.036) 0:00:20.784 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.035) 0:00:20.819 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102337.6621215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102337.6621215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10275, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654102337.6621215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.379) 0:00:21.199 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.034) 0:00:21.234 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.036) 0:00:21.270 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.033) 0:00:21.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.031) 0:00:21.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 16:52:24 +0000 (0:00:00.033) 0:00:21.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.028) 0:00:21.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.028) 0:00:21.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.028) 0:00:21.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.035) 0:00:21.490 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.031) 0:00:21.521 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.029) 
0:00:21.551 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.030) 0:00:21.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.030) 0:00:21.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.031) 0:00:21.644 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.037) 0:00:21.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.036) 0:00:21.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 
Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.032) 0:00:21.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.028) 0:00:21.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.030) 0:00:21.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.029) 0:00:21.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.035) 0:00:21.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.037) 0:00:21.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.029) 0:00:21.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.028) 0:00:21.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.029) 0:00:22.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.038) 0:00:22.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:52:25 +0000 (0:00:00.046) 0:00:22.085 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.546) 0:00:22.631 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.376) 0:00:23.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.038) 0:00:23.046 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.037) 0:00:23.083 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.030) 0:00:23.113 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.028) 0:00:23.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.029) 0:00:23.172 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.028) 0:00:23.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.031) 0:00:23.232 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.032) 0:00:23.264 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.031) 0:00:23.295 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:52:26 +0000 (0:00:00.039) 0:00:23.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036373", "end": "2022-06-01 12:52:26.749687", "rc": 0, "start": "2022-06-01 12:52:26.713314" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.405) 0:00:23.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] 
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.039) 0:00:23.780 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.039) 0:00:23.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.032) 0:00:23.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.031) 0:00:23.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.032) 0:00:23.916 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.032) 0:00:23.948 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.032) 0:00:23.981 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.029) 0:00:24.010 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.029) 0:00:24.040 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Create another volume in the existing pool, identified only by name.] ****
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:41
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.030) 0:00:24.070 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.062) 0:00:24.133 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:52:27 +0000 (0:00:00.045) 0:00:24.178 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.545) 0:00:24.724 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.068) 0:00:24.793 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.033) 0:00:24.826 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.034) 0:00:24.861 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.076) 0:00:24.937 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.026) 0:00:24.964 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.032) 0:00:24.997 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "name": "foo", "volumes": [ { "fs_label": "newvol", "fs_type": "ext4", "name": "newvol", "size": "2 GiB" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.036) 0:00:25.034 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.034) 0:00:25.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.030) 0:00:25.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.029) 0:00:25.129 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.030) 0:00:25.160 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.031) 0:00:25.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.095) 0:00:25.288 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:52:28 +0000 (0:00:00.029) 0:00:25.317 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-newvol" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "e2fsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:52:30 +0000 (0:00:01.670) 0:00:26.987 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.032) 0:00:27.019 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.028) 0:00:27.048 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-newvol" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "e2fsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.038) 0:00:27.087 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.037) 0:00:27.124 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.034) 0:00:27.158 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.028) 0:00:27.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.029) 0:00:27.216 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.027) 0:00:27.244 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:52:30 +0000 (0:00:00.030) 0:00:27.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:52:31 +0000 (0:00:00.371) 0:00:27.646 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:52:31 +0000 (0:00:00.031) 0:00:27.677 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:53
Wednesday 01 June 2022 16:52:32 +0000 (0:00:00.890) 0:00:28.567 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:52:32 +0000 (0:00:00.054) 0:00:28.622 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-newvol", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-newvol", "_raw_device": "/dev/mapper/foo-newvol", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "newvol", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "newvol", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2 GiB", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:52:32 +0000 (0:00:00.040) 0:00:28.662 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:52:32 +0000 (0:00:00.028) 0:00:28.690 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-newvol": { "fstype": "ext4", "label": "newvol", "name": "/dev/mapper/foo-newvol", "size": "2G", "type": "lvm", "uuid": "6f85bb1a-5319-4ce4-8f90-0d8d4caaf52f" }, "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a6293297-0ffc-4812-8f1c-6e777f3c0646" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "JU44Xy-A7Bn-Up5z-CEWT-0tU4-260x-RsWE9f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:52:32 +0000 (0:00:00.391) 0:00:29.082 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002717", "end": "2022-06-01 12:52:32.464800", "rc": 0, "start": "2022-06-01 12:52:32.462083" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.374) 0:00:29.457 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002492", "end": "2022-06-01 12:52:32.829089", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:52:32.826597" }
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.362) 0:00:29.819 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.064) 0:00:29.883 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.031) 0:00:29.915 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.063) 0:00:29.978 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:52:33 +0000 (0:00:00.072) 0:00:30.050 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.376) 0:00:30.427 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.041) 0:00:30.469 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.037) 0:00:30.506 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.035) 0:00:30.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.034) 0:00:30.576 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.028) 0:00:30.605 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.043) 0:00:30.649 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.056) 0:00:30.705 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.030) 0:00:30.736 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.030) 0:00:30.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.031) 0:00:30.799 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.029) 0:00:30.828 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.029) 0:00:30.858 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.029) 0:00:30.887 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.030) 0:00:30.918 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.034) 0:00:30.952 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.062) 0:00:31.015 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.062) 0:00:31.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.031) 0:00:31.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.030) 0:00:31.140 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.029) 0:00:31.170 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.060) 0:00:31.230 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.035) 0:00:31.266 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.036) 0:00:31.303 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 16:52:34 +0000 (0:00:00.057) 0:00:31.360 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.036) 0:00:31.396 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.036) 0:00:31.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.032) 0:00:31.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.029) 0:00:31.494 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.027) 0:00:31.522 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.029) 0:00:31.551 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.031) 0:00:31.582 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.065) 0:00:31.647 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.063) 0:00:31.710 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.031) 0:00:31.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.033) 0:00:31.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.031) 0:00:31.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.030) 0:00:31.837 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.030) 0:00:31.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.031) 0:00:31.899 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.030) 0:00:31.930 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.035) 0:00:31.966 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.032) 0:00:31.998 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.059) 0:00:32.058 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.036) 0:00:32.094 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.162) 0:00:32.257 ********
ok: [/cache/rhel-x.qcow2] => {
"ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-newvol" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.038) 0:00:32.295 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.040) 0:00:32.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:52:35 +0000 (0:00:00.030) 0:00:32.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.037) 0:00:32.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.029) 0:00:32.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 
16:52:36 +0000 (0:00:00.032) 0:00:32.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.034) 0:00:32.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.032) 0:00:32.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.031) 0:00:32.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.044) 0:00:32.610 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.034) 0:00:32.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.035) 0:00:32.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.032) 0:00:32.712 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.031) 0:00:32.744 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.036) 0:00:32.780 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.035) 0:00:32.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 
1654102349.9151216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102349.9151216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10337, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102349.9151216, "nlink": 1, "path": "/dev/mapper/foo-newvol", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.387) 0:00:33.204 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.037) 0:00:33.242 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.033) 0:00:33.276 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.032) 0:00:33.308 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the 
volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.030) 0:00:33.338 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:52:36 +0000 (0:00:00.035) 0:00:33.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.031) 0:00:33.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.031) 0:00:33.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.030) 0:00:33.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.040) 0:00:33.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.028) 0:00:33.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.027) 0:00:33.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.027) 0:00:33.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.027) 0:00:33.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.027) 0:00:33.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.040) 0:00:33.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab 
entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.035) 0:00:33.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.029) 0:00:33.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.030) 0:00:33.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.030) 0:00:33.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.032) 0:00:33.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.031) 0:00:33.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.031) 0:00:33.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.030) 0:00:33.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.034) 0:00:33.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.033) 0:00:34.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.031) 0:00:34.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:52:37 +0000 (0:00:00.041) 0:00:34.081 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the 
volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.401) 0:00:34.482 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.372) 0:00:34.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.077) 0:00:34.932 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.034) 0:00:34.967 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.029) 0:00:34.997 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.031) 0:00:35.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.031) 0:00:35.059 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.030) 0:00:35.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.032) 0:00:35.122 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.034) 0:00:35.157 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.034) 0:00:35.191 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:52:38 +0000 (0:00:00.038) 0:00:35.230 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/newvol" ], "delta": "0:00:00.033789", "end": "2022-06-01 12:52:38.666596", "rc": 0, "start": "2022-06-01 12:52:38.632807" } STDOUT: LVM2_LV_NAME=newvol LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.427) 0:00:35.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.039) 0:00:35.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.040) 0:00:35.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.032) 0:00:35.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.032) 0:00:35.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.032) 0:00:35.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.032) 0:00:35.867 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.031) 0:00:35.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.033) 0:00:35.932 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.028) 0:00:35.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up.] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:55 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.030) 0:00:35.991 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.074) 0:00:36.066 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:52:39 +0000 (0:00:00.045) 0:00:36.111 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.523) 0:00:36.634 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.071) 0:00:36.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.032) 0:00:36.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.031) 0:00:36.769 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.062) 0:00:36.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.027) 0:00:36.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:52:40 
+0000 (0:00:00.030) 0:00:36.890 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "name": "foo", "state": "absent" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.034) 0:00:36.924 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.032) 0:00:36.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.032) 0:00:36.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.030) 0:00:37.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.034) 0:00:37.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.031) 0:00:37.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.045) 0:00:37.131 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:52:40 +0000 (0:00:00.028) 0:00:37.159 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround 
for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:52:42 +0000 (0:00:01.931) 0:00:39.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.032) 0:00:39.123 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.029) 0:00:39.152 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-newvol", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-newvol", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], 
"volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.036) 0:00:39.188 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.035) 0:00:39.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.038) 0:00:39.262 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.034) 0:00:39.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.033) 0:00:39.330 ******** TASK [linux-system-roles.storage : tell systemd to 
refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.028) 0:00:39.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:52:42 +0000 (0:00:00.029) 0:00:39.389 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:52:43 +0000 (0:00:00.362) 0:00:39.751 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:52:43 +0000 (0:00:00.030) 0:00:39.782 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:63 Wednesday 01 June 2022 16:52:44 +0000 (0:00:00.814) 0:00:40.596 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:52:44 +0000 (0:00:00.060) 0:00:40.656 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:52:44 +0000 (0:00:00.036) 0:00:40.693 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:52:44 +0000 (0:00:00.029) 0:00:40.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": 
{ "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:52:44 +0000 (0:00:00.366) 0:00:41.089 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002578", "end": "2022-06-01 12:52:44.460402", "rc": 0, "start": "2022-06-01 12:52:44.457824" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.363) 0:00:41.453 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002793", "end": "2022-06-01 12:52:44.840243", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:52:44.837450" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.379) 0:00:41.832 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.057) 0:00:41.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.030) 0:00:41.921 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.062) 0:00:41.983 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.039) 0:00:42.023 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.029) 0:00:42.052 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.027) 0:00:42.080 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.036) 0:00:42.117 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.036) 0:00:42.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.074) 0:00:42.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.030) 0:00:42.259 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.028) 0:00:42.287 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.057) 0:00:42.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:52:45 +0000 (0:00:00.031) 
0:00:42.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.033) 0:00:42.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.029) 0:00:42.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.029) 0:00:42.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.027) 0:00:42.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.028) 0:00:42.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.029) 0:00:42.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.032) 0:00:42.588 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.058) 0:00:42.647 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.028) 0:00:42.675 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.062) 0:00:42.737 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.034) 0:00:42.772 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.026) 0:00:42.799 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.026) 
0:00:42.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.030) 0:00:42.856 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.064) 0:00:42.920 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.029) 0:00:42.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.031) 0:00:42.982 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.027) 0:00:43.009 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.032) 0:00:43.042 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.030) 0:00:43.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=243 changed=3 unreachable=0 failed=0 skipped=189 rescued=0 ignored=0 Wednesday 01 June 2022 16:52:46 +0000 (0:00:00.016) 0:00:43.088 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_scsi_generated.yml:3 ----------- linux-system-roles.storage : Update facts ------------------------------- 0.89s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.79s /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml:2 -------------------------- linux-system-roles.storage : get required packages ---------------------- 0.71s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 parse the actual size of the volume ------------------------------------- 0.55s /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 -------------------------- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Find unused disks in the system ----------------------------------------- 0.53s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.53s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:52:47 +0000 (0:00:00.024) 0:00:00.024 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:52:48 +0000 (0:00:01.265) 0:00:01.289 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_fatals_cache_volume.yml **************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:2 Wednesday 01 June 
2022 16:52:48 +0000 (0:00:00.014) 0:00:01.303 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:13 Wednesday 01 June 2022 16:52:49 +0000 (0:00:01.086) 0:00:02.390 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:52:49 +0000 (0:00:00.037) 0:00:02.427 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.152) 0:00:02.580 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.520) 0:00:03.100 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.073) 0:00:03.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.022) 0:00:03.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.022) 0:00:03.219 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.194) 0:00:03.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:52:50 +0000 (0:00:00.019) 0:00:03.434 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:52:51 +0000 (0:00:01.046) 0:00:04.480 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:52:51 +0000 (0:00:00.044) 0:00:04.525 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.045) 0:00:04.570 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.706) 0:00:05.276 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.076) 0:00:05.353 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.019) 0:00:05.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.020) 0:00:05.394 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:52:52 +0000 (0:00:00.018) 0:00:05.412 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:52:53 +0000 (0:00:00.862) 0:00:06.275 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:52:55 +0000 (0:00:01.799) 0:00:08.074 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:52:55 +0000 
(0:00:00.043) 0:00:08.117 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:52:55 +0000 (0:00:00.026) 0:00:08.144 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.521) 0:00:08.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.030) 0:00:08.696 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.027) 0:00:08.723 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.032) 0:00:08.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.032) 0:00:08.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.032) 0:00:08.820 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.031) 0:00:08.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.055) 0:00:08.907 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.028) 0:00:08.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.028) 0:00:08.964 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:52:56 +0000 (0:00:00.475) 0:00:09.440 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:52:56 +0000 (0:00:00.028) 0:00:09.468 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:16 Wednesday 01 June 2022 16:52:57 +0000 (0:00:00.826) 0:00:10.295 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:23 Wednesday 01 June 2022 16:52:57 +0000 (0:00:00.030) 0:00:10.325 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:52:57 +0000 (0:00:00.043) 0:00:10.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.476) 0:00:10.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.035) 0:00:10.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.029) 0:00:10.911 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create cached partition] ************************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:30 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.032) 0:00:10.944 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.047) 0:00:10.991 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.040) 0:00:11.031 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:52:58 +0000 (0:00:00.492) 0:00:11.524 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.067) 0:00:11.591 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.029) 0:00:11.621 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.029) 0:00:11.651 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.059) 0:00:11.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.024) 0:00:11.735 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.028) 0:00:11.764 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "sda", "type": "partition", "volumes": [ { "cached": true, "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.035) 0:00:11.800 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.031) 0:00:11.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.030) 0:00:11.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.029) 0:00:11.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] 
************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.028) 0:00:11.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.028) 0:00:11.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.043) 0:00:11.990 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:52:59 +0000 (0:00:00.026) 0:00:12.017 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: caching is not supported for partition volumes TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:53:00 +0000 (0:00:01.075) 0:00:13.093 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'sda', u'encryption_password': None, u'encryption': False, u'disks': [u'sda', u'sdb'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': True, u'type': u'partition', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': 
None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'caching is not supported for partition volumes'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:00 +0000 (0:00:00.042) 0:00:13.136 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:48 Wednesday 01 June 2022 16:53:00 +0000 (0:00:00.029) 0:00:13.165 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create cached volume] **************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:56 Wednesday 01 June 2022 16:53:00 +0000 (0:00:00.034) 0:00:13.199 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:00 +0000 (0:00:00.084) 0:00:13.284 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] 
********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:00 +0000 (0:00:00.043) 0:00:13.327 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.511) 0:00:13.839 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.075) 0:00:13.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.036) 0:00:13.951 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.030) 0:00:13.981 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.062) 0:00:14.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.024) 0:00:14.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.030) 0:00:14.099 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "cache_devices": [ "sdb" ], "cache_size": "4g", "cached": true, "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.038) 0:00:14.138 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.033) 0:00:14.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.030) 0:00:14.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.032) 0:00:14.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.028) 0:00:14.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.030) 0:00:14.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.042) 0:00:14.336 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:53:01 +0000 (0:00:00.027) 0:00:14.363 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cache device 'sdb' doesn't seems to be a physical volume or its parent TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:53:02 +0000 (0:00:01.116) 0:00:15.480 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'5g', u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, 
u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': True, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': u'4g', u'raid_chunk_size': None, u'cache_devices': [u'sdb'], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cache device 'sdb' doesn't seems to be a physical volume or its parent"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:02 +0000 (0:00:00.039) 0:00:15.520 ******** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:75 Wednesday 01 June 2022 16:53:02 +0000 (0:00:00.028) 0:00:15.549 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=47 changed=0 unreachable=0 failed=2 skipped=29 rescued=2 ignored=0 Wednesday 01 June 2022 16:53:03 +0000 (0:00:00.019) 0:00:15.569 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.12s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:2 ------------------------ linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : make sure required packages are installed --- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.71s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.52s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.52s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.49s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Find unused disks in the system ----------------------------------------- 0.48s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.48s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ Create cached volume ---------------------------------------------------- 0.08s /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:56 ----------------------- linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', 
u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file 
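The two fatal results in the run above come from intentionally invalid storage specs: a partition-type volume with `cached: true`, and an LVM cache device (`sdb`) that is not a member of the pool. As a hedged sketch (reconstructed from the module arguments echoed in the log, not taken from the actual test files), a spec the role would accept puts the cache device inside an LVM pool's own disk list:

```yaml
# Sketch of a storage_pools spec where a cached volume is valid:
# the pool is LVM (caching is not supported for partition volumes),
# and the cache device 'sdb' is a member disk of the pool, so blivet
# can use it as a physical volume. Names reuse those from the log.
storage_pools:
  - name: foo
    type: lvm
    disks:
      - sda
      - sdb          # pool member, so it may back the cache
    volumes:
      - name: test1
        size: 5g
        cached: true
        cache_size: 4g
        cache_devices:
          - sdb
```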
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:53:03 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:05 +0000 (0:00:01.268) 0:00:01.291 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_fatals_cache_volume_nvme_generated.yml ************************* 2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran 
handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:05 +0000 (0:00:00.016) 0:00:01.308 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file 
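The "Check that we failed in the role" tasks seen in these runs assert on an expected role failure, which is usually done with Ansible's block/rescue. A hedged reconstruction of that pattern (the test playbook itself is not shown in this log, so task details are assumptions) looks like:

```yaml
# Sketch: run the role with an invalid spec inside a block, rescue the
# expected failure, then assert on the rescued error. 'ansible_failed_result'
# is set by Ansible inside a rescue section.
- name: Create cached partition            # expected to fail in the role
  block:
    - include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: sda
            type: partition
            disks: [sda, sdb]
            volumes:
              - name: test1
                type: partition
                cached: true               # invalid: partitions can't be cached
    - fail:
        msg: the role should have raised an error
  rescue:
    - name: Check that we failed in the role
      assert:
        that:
          - "'caching is not supported' in ansible_failed_result.msg"
```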
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:53:05 +0000 (0:00:00.021)       0:00:00.021 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:53:07 +0000 (0:00:01.283)       0:00:01.305 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_fatals_cache_volume_scsi_generated.yml *************************
2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_scsi_generated.yml:3
Wednesday 01 June 2022  16:53:07 +0000 (0:00:00.014)       0:00:01.319 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_scsi_generated.yml:7
Wednesday 01 June 2022  16:53:08 +0000 (0:00:01.079)       0:00:02.398 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:2
Wednesday 01 June 2022  16:53:08 +0000 (0:00:00.025)       0:00:02.424 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:13
Wednesday 01 June 2022  16:53:08 +0000 (0:00:00.818)       0:00:03.242 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.036)       0:00:03.279 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.154)       0:00:03.433 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.513)       0:00:03.947 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.076)       0:00:04.023 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.022)       0:00:04.045 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:53:09 +0000 (0:00:00.022)       0:00:04.067 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:53:10 +0000 (0:00:00.197)       0:00:04.265 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:53:10 +0000 (0:00:00.018)       0:00:04.284 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:53:11 +0000 (0:00:01.060)       0:00:05.345 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:53:11 +0000 (0:00:00.047)       0:00:05.392 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:53:11 +0000 (0:00:00.044)       0:00:05.436 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:53:11 +0000 (0:00:00.701) 0:00:06.138 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:53:11 +0000 (0:00:00.080) 0:00:06.218 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:53:11 +0000 (0:00:00.020) 0:00:06.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:53:12 +0000 (0:00:00.022) 0:00:06.261 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:12 +0000 (0:00:00.018) 0:00:06.279 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:12 +0000 (0:00:00.793) 0:00:07.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": 
{ "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": 
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "unknown"
            }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:53:14 +0000 (0:00:01.774)       0:00:08.847 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:53:14 +0000 (0:00:00.040)       0:00:08.888 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:53:14 +0000 (0:00:00.024)       0:00:08.912 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.502)       0:00:09.415 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.030)       0:00:09.445 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.026)       0:00:09.472 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.033)       0:00:09.506 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.033)       0:00:09.539 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.032)       0:00:09.572 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.031)       0:00:09.603 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.028)       0:00:09.632 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.027)       0:00:09.659 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.028)       0:00:09.687 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.481)       0:00:10.169 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022  16:53:15 +0000 (0:00:00.028)       0:00:10.197 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:16
Wednesday 01 June 2022  16:53:16 +0000 (0:00:00.800)       0:00:10.998 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:23
Wednesday 01 June 2022  16:53:16 +0000 (0:00:00.030)       0:00:11.028 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022  16:53:16 +0000 (0:00:00.043)       0:00:11.071 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "disks": [
        "sda",
        "sdb"
    ]
} TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.514) 0:00:11.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.035) 0:00:11.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.029) 0:00:11.651 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create cached partition] ************************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:30 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.032) 0:00:11.684 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.047) 0:00:11.732 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:17 +0000 (0:00:00.045) 0:00:11.777 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.510) 0:00:12.288 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.067) 0:00:12.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.029) 0:00:12.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.029) 0:00:12.414 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.089) 0:00:12.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.024) 0:00:12.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.028) 0:00:12.557 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "sda", "type": "partition", "volumes": [ { "cached": true, "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.033) 0:00:12.591 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.033) 0:00:12.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.028) 0:00:12.654 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.030) 0:00:12.684 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.029) 0:00:12.714 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.028) 0:00:12.743 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.042) 0:00:12.785 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:53:18 +0000 (0:00:00.029) 0:00:12.814 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

caching is not supported for partition volumes

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:53:19 +0000 (0:00:01.049) 0:00:13.864 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "changed": false
}

MSG:

{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'sda', u'encryption_password': None, u'encryption': False, u'disks': [u'sda', u'sdb'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': True, u'type': u'partition', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'caching is not supported for partition volumes'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:53:19 +0000 (0:00:00.040) 0:00:13.905 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:48
Wednesday 01 June 2022 16:53:19 +0000 (0:00:00.033) 0:00:13.931 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Create cached volume] ****************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:56
Wednesday 01 June 2022 16:53:19
+0000 (0:00:00.033) 0:00:13.965 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:19 +0000 (0:00:00.047) 0:00:14.012 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:19 +0000 (0:00:00.048) 0:00:14.060 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.821) 0:00:14.882 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.072) 0:00:14.954 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.030) 0:00:14.985 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.031) 0:00:15.016 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.061) 0:00:15.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.027) 0:00:15.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.029) 0:00:15.135 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ 
{ "cache_devices": [ "sdb" ], "cache_size": "4g", "cached": true, "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.036) 0:00:15.172 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.031) 0:00:15.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:53:20 +0000 (0:00:00.029) 0:00:15.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:21 +0000 (0:00:00.028) 0:00:15.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:21 +0000 (0:00:00.030) 0:00:15.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:53:21 +0000 (0:00:00.029) 0:00:15.321 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:53:21 +0000 (0:00:00.042) 0:00:15.364 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:53:21 +0000 (0:00:00.026) 0:00:15.391 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

cache device 'sdb' doesn't seems to be a physical volume or its parent

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 16:53:22 +0000 (0:00:01.055) 0:00:16.447 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'5g', u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': True, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': u'4g', u'raid_chunk_size': None, u'cache_devices': [u'sdb'], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, 
u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cache device 'sdb' doesn't seems to be a physical volume or its parent"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:53:22 +0000 (0:00:00.042) 0:00:16.489 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:75
Wednesday 01 June 2022 16:53:22 +0000 (0:00:00.026) 0:00:16.515 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=49 changed=0 unreachable=0 failed=2 skipped=29 rescued=2 ignored=0

Wednesday 01 June 2022 16:53:22 +0000 (0:00:00.018) 0:00:16.534 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_scsi_generated.yml:3 ---------
linux-system-roles.storage : make sure blivet is available -------------- 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml:2 ------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Find unused disks in the system ----------------------------------------- 0.51s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.50s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.48s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.15s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : include the appropriate provider tasks ----- 0.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:53:23 +0000 (0:00:00.024) 0:00:00.024 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:24 +0000 (0:00:01.274) 0:00:01.298 ******** =============================================================================== set up internal 
repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_fatals_raid_pool.yml ******************************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:2 Wednesday 01 June 2022 16:53:24 +0000 (0:00:00.011) 0:00:01.310 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:17 Wednesday 01 June 2022 16:53:25 +0000 (0:00:01.082) 0:00:02.393 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:25 +0000 (0:00:00.034) 0:00:02.428 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:25 +0000 (0:00:00.153) 0:00:02.581 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.512) 0:00:03.093 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.077) 0:00:03.171 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.023) 0:00:03.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.022) 0:00:03.217 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.199) 0:00:03.416 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:26 +0000 (0:00:00.018) 0:00:03.435 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:27 +0000 (0:00:01.095) 0:00:04.531 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:27 +0000 (0:00:00.045) 0:00:04.577 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:27 +0000 (0:00:00.045) 0:00:04.623 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:53:28 +0000 (0:00:00.687) 0:00:05.310 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:53:28 +0000 (0:00:00.078) 0:00:05.389 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:53:28 +0000 (0:00:00.020) 0:00:05.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:53:28 +0000 (0:00:00.020) 0:00:05.431 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:28 +0000 (0:00:00.020) 0:00:05.451 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:29 +0000 (0:00:00.807) 0:00:06.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:53:31 +0000 (0:00:01.765) 0:00:08.024 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:53:31 +0000 (0:00:00.042) 0:00:08.067 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:53:31 +0000 (0:00:00.026) 0:00:08.093 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:53:31 +0000 (0:00:00.737) 0:00:08.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:31 +0000 (0:00:00.076) 0:00:08.907 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:53:31 +0000 (0:00:00.028) 0:00:08.935 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], 
"volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.033) 0:00:08.969 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.032) 0:00:09.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.031) 0:00:09.033 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.026) 0:00:09.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.030) 0:00:09.090 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.026) 0:00:09.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab 
file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.025) 0:00:09.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.466) 0:00:09.609 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:53:32 +0000 (0:00:00.027) 0:00:09.637 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:20 Wednesday 01 June 2022 16:53:33 +0000 (0:00:00.795) 0:00:10.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:27 Wednesday 01 June 2022 16:53:33 +0000 (0:00:00.029) 0:00:10.462 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:53:33 +0000 (0:00:00.041) 0:00:10.504 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.515) 0:00:11.019 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.035) 0:00:11.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.029) 0:00:11.085 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a disk volume mounted at "{{ mount_location }}"] ****************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:33 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.033) 0:00:11.118 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.046) 0:00:11.164 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.040) 0:00:11.205 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.492) 0:00:11.698 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.066) 0:00:11.764 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.030) 0:00:11.794 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.028) 0:00:11.823 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.059) 
0:00:11.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.024) 0:00:11.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:34 +0000 (0:00:00.030) 0:00:11.937 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "fail", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.037) 0:00:11.974 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.032) 0:00:12.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:53:35 +0000 
(0:00:00.058) 0:00:12.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.029) 0:00:12.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.028) 0:00:12.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.027) 0:00:12.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.042) 0:00:12.194 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:53:35 +0000 (0:00:00.030) 0:00:12.224 ******** An exception occurred during task execution. To see the full traceback, use -vvv. The error was: blivet.errors.DeviceError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level level = levels.raid_level(value) File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level raise RaidError("invalid RAID level descriptor %s" % descriptor) blivet.errors.RaidError: invalid RAID level descriptor fail During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level level = self._get_level(value, self._levels) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level raise ValueError(message % {"raid_level": value, "levels": choices}) ValueError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 102, in <module> _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 40, in invoke_module runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.9/runpy.py", line 210, in run_module return _run_module_code(code, init_globals, run_name, mod_spec) File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code _run_code(code, mod_globals, init_globals, File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module> File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in 
run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray return MDRaidArrayDevice(name, *args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__ raise e File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__ self.level = level File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level raise errors.DeviceError(e) blivet.errors.DeviceError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). MODULE_STDERR: Shared connection to 127.0.0.3 closed. TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:53:36 +0000 (0:00:01.088) 0:00:13.313 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'exception': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor fail\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 102, in <module>\r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module>\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members\r\n File 
"/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor fail\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 102, in <module>\r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102415.33-66056-161883465135431/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module>\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create\r\n File "/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members\r\n File 
"/tmp/ansible_blivet_payload_2onmw8pa/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'failed': True, u'rc': 1, u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:36 +0000 (0:00:00.035) 0:00:13.349 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:59 Wednesday 01 June 2022 16:53:36 +0000 (0:00:00.028) 0:00:13.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=37 changed=0 unreachable=0 failed=1 skipped=21 rescued=1 ignored=0 Wednesday 01 June 2022 16:53:36 +0000 (0:00:00.018) 0:00:13.395 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.10s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.09s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.08s /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:2 --------------------------- linux-system-roles.storage : make sure required packages are installed --- 0.81s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.74s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Find unused disks in the system ----------------------------------------- 0.52s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.49s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.47s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : Workaround for udev issue on some platforms --- 0.08s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 linux-system-roles.storage : Set platform/version specific variables ---- 0.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : include the appropriate provider tasks ----- 0.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:53:37 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:38 +0000 (0:00:01.298) 0:00:01.321 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.30s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_fatals_raid_pool_nvme_generated.yml **************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran 
handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:38 +0000 (0:00:00.014) 0:00:01.336 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.30s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:53:39 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:53:40 +0000 (0:00:01.252) 0:00:01.275 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_fatals_raid_pool_scsi_generated.yml **************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_scsi_generated.yml:3 Wednesday 01 June 2022 16:53:40 +0000 (0:00:00.013) 0:00:01.289 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_scsi_generated.yml:7 Wednesday 01 June 2022 16:53:41 +0000 (0:00:01.072) 0:00:02.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:2 Wednesday 01 June 2022 16:53:41 +0000 (0:00:00.026) 0:00:02.387 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:17 Wednesday 01 June 2022 16:53:42 +0000 (0:00:00.794) 0:00:03.182 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:42 +0000 (0:00:00.036) 0:00:03.218 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:42 +0000 (0:00:00.165) 0:00:03.383 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.540) 0:00:03.924 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.077) 0:00:04.002 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.023) 0:00:04.025 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.022) 0:00:04.047 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.192) 0:00:04.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:43 +0000 (0:00:00.020) 0:00:04.260 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:44 +0000 (0:00:01.057) 0:00:05.318 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:44 +0000 (0:00:00.051) 0:00:05.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:44 +0000 (0:00:00.044) 0:00:05.414 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 
Wednesday 01 June 2022 16:53:45 +0000 (0:00:00.693) 0:00:06.107 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:53:45 +0000 (0:00:00.079) 0:00:06.187 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:53:45 +0000 (0:00:00.020) 0:00:06.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:53:45 +0000 (0:00:00.021) 0:00:06.228 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:45 +0000 (0:00:00.021) 0:00:06.250 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:46 +0000 (0:00:00.794) 0:00:07.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": 
{ "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": 
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:53:48 +0000 (0:00:01.825) 0:00:08.869 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.040) 0:00:08.910 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.024) 0:00:08.935 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.504) 0:00:09.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.029) 0:00:09.470 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.026) 0:00:09.497 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.032) 0:00:09.529 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.031) 0:00:09.561 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.031) 0:00:09.593 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.029) 0:00:09.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.029) 0:00:09.652 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.027) 0:00:09.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:53:48 +0000 (0:00:00.027) 0:00:09.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:53:49 +0000 (0:00:00.446) 0:00:10.153 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:53:49 +0000 (0:00:00.028) 0:00:10.182 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:20 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.775) 0:00:10.958 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:27 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.029) 0:00:10.987 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.041) 0:00:11.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } 
TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.519) 0:00:11.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.036) 0:00:11.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.028) 0:00:11.613 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a disk volume mounted at "{{ mount_location }}"] ****************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:33 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.034) 0:00:11.647 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.046) 0:00:11.694 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:53:50 +0000 (0:00:00.042) 0:00:11.736 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.542) 0:00:12.279 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.068) 0:00:12.347 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.029) 0:00:12.377 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.028) 0:00:12.405 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.061) 0:00:12.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.024) 0:00:12.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.028) 0:00:12.520 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "fail", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.039) 0:00:12.559 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.031) 0:00:12.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.030) 0:00:12.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.028) 0:00:12.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.027) 0:00:12.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.028) 0:00:12.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.042) 0:00:12.748 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:53:51 +0000 (0:00:00.028) 0:00:12.777 ******** An exception occurred during task 
execution. To see the full traceback, use -vvv. The error was: blivet.errors.DeviceError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level level = levels.raid_level(value) File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level raise RaidError("invalid RAID level descriptor %s" % descriptor) blivet.errors.RaidError: invalid RAID level descriptor fail During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level level = self._get_level(value, self._levels) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level raise ValueError(message % {"raid_level": value, "levels": choices}) ValueError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 40, in invoke_module runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.9/runpy.py", line 210, in run_module return _run_module_code(code, init_globals, run_name, mod_spec) File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code _run_code(code, mod_globals, init_globals, File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in 
run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray return MDRaidArrayDevice(name, *args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__ raise e File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__ self.level = level File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level raise errors.DeviceError(e) blivet.errors.DeviceError: RAID level fail is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). MODULE_STDERR: Shared connection to 127.0.0.3 closed. TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:53:53 +0000 (0:00:01.112) 0:00:13.889 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'exception': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor fail\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members\r\n File 
"/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor fail\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102432.06-66480-214227205851842/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1665, in run_module\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1320, in manage_pool\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1220, in manage\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1280, in _create\r\n File "/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1173, in _create_members\r\n File 
"/tmp/ansible_blivet_payload_lrsrgpdy/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level fail is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'failed': True, u'rc': 1, u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:53:53 +0000 (0:00:00.036) 0:00:13.926 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:59 Wednesday 01 June 2022 16:53:53 +0000 (0:00:00.028) 0:00:13.954 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=39 changed=0 unreachable=0 failed=1 skipped=21 rescued=1 ignored=0 Wednesday 01 June 2022 16:53:53 +0000 (0:00:00.018) 0:00:13.973 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_scsi_generated.yml:3 ------------ linux-system-roles.storage : make sure blivet is available -------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : make sure required packages are installed --- 0.79s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Gathering Facts --------------------------------------------------------- 0.79s /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml:2 --------------------------- linux-system-roles.storage : Update facts ------------------------------- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Find unused disks in the system ----------------------------------------- 0.52s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.50s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.45s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.17s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : Set platform/version specific variables ---- 0.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : include the appropriate provider tasks ----- 0.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  16:53:53 +0000 (0:00:00.022)       0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  16:53:55 +0000 (0:00:01.268)       0:00:01.290 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_fatals_raid_volume.yml *****************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:2
Wednesday 01 June 2022  16:53:55 +0000 (0:00:00.011)       0:00:01.301 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:10
Wednesday 01 June 2022  16:53:56 +0000 (0:00:01.091)       0:00:02.393 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:53:56 +0000 (0:00:00.035)       0:00:02.429 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:53:56 +0000 (0:00:00.148)       0:00:02.577 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.527)       0:00:03.105 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] =>
(item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.080)       0:00:03.186 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.021)       0:00:03.207 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.022)       0:00:03.230 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.191)       0:00:03.421 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:53:57 +0000 (0:00:00.018)       0:00:03.439 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:53:58 +0000 (0:00:01.058)       0:00:04.498 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:53:58 +0000 (0:00:00.048)       0:00:04.547 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:53:58 +0000 (0:00:00.044)       0:00:04.592 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:53:59 +0000 (0:00:00.673)       0:00:05.265 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  16:53:59 +0000 (0:00:00.079)       0:00:05.344 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  16:53:59 +0000 (0:00:00.021)       0:00:05.366 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  16:53:59 +0000 (0:00:00.022)       0:00:05.389 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:53:59 +0000 (0:00:00.020)       0:00:05.409 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:54:00 +0000 (0:00:00.824)       0:00:06.234 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state":
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:01 +0000 (0:00:01.799) 0:00:08.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:01 +0000 
(0:00:00.043) 0:00:08.077 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.028) 0:00:08.106 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.512) 0:00:08.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.036) 0:00:08.656 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.027) 0:00:08.683 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.030) 0:00:08.714 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.033) 0:00:08.748 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.031) 0:00:08.780 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.026) 0:00:08.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.030) 0:00:08.838 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.028) 0:00:08.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:54:02 +0000 (0:00:00.028) 0:00:08.895 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:54:03 +0000 (0:00:00.458) 0:00:09.353 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:54:03 +0000 (0:00:00.027) 0:00:09.381 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:13 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.833) 0:00:10.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:20 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.030) 0:00:10.245 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.042) 0:00:10.287 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.494) 0:00:10.782 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.036) 0:00:10.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** 
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.029) 0:00:10.848 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:26 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.036) 0:00:10.885 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.046) 0:00:10.931 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:54:04 +0000 (0:00:00.042) 0:00:10.974 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.527) 0:00:11.502 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.068) 0:00:11.570 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.031) 0:00:11.602 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.029) 0:00:11.632 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.057) 0:00:11.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.057) 0:00:11.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.029) 0:00:11.777 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.031) 0:00:11.808 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "null", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.037) 0:00:11.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.031) 0:00:11.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.030) 0:00:11.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.032) 0:00:11.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.030) 0:00:11.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.040) 0:00:12.011 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:05 +0000 (0:00:00.027) 0:00:12.039 ******** An exception occurred during task execution. To see the full traceback, use -vvv. The error was: blivet.errors.DeviceError: RAID level null is an invalid value. Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10). fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level level = levels.raid_level(value) File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level raise RaidError("invalid RAID level descriptor %s" % descriptor) blivet.errors.RaidError: invalid RAID level descriptor null During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level level = self._get_level(value, self._levels) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level raise ValueError(message % {"raid_level": value, "levels": choices}) ValueError: RAID level null is an invalid value. Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10). 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 40, in invoke_module runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.9/runpy.py", line 210, in run_module return _run_module_code(code, init_globals, run_name, mod_spec) File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code _run_code(code, mod_globals, init_globals, File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray return 
MDRaidArrayDevice(name, *args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__ raise e File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__ self.level = level File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level raise errors.DeviceError(e) blivet.errors.DeviceError: RAID level null is an invalid value. Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10). MODULE_STDERR: Shared connection to 127.0.0.3 closed. TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:54:07 +0000 (0:00:01.143) 0:00:13.183 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'exception': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor null\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level null is an invalid value. 
Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File 
"/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level null is an invalid value. Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10).\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor null\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File 
"/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level null is an invalid value. Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102446.0-66827-127171078112202/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage\r\n File 
"/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create\r\n File "/tmp/ansible_blivet_payload_thyz6k92/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level null is an invalid value. 
Must be one of (raid1, raid4, linear, raid5, raid6, raid0, raid10).\r\n', u'failed': True, u'rc': 1, u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:07 +0000 (0:00:00.035) 0:00:13.218 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:43 Wednesday 01 June 2022 16:54:07 +0000 (0:00:00.026) 0:00:13.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=37 changed=0 unreachable=0 failed=1 skipped=21 rescued=1 ignored=0 Wednesday 01 June 2022 16:54:07 +0000 (0:00:00.018) 0:00:13.263 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.14s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:2 ------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.83s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Find unused disks in the system ----------------------------------------- 0.49s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.46s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.15s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.07s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : get a list of rpm packages installed on host machine --- 0.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 linux-system-roles.storage : include the appropriate provider tasks ----- 0.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:54:07 +0000 (0:00:00.026) 0:00:00.026 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:54:09 +0000 (0:00:01.242) 0:00:01.269 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_fatals_raid_volume_nvme_generated.yml **************************
2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:54:09 +0000 (0:00:00.015) 0:00:01.284 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:54:09 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:54:11 +0000 (0:00:01.281) 0:00:01.304 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_fatals_raid_volume_scsi_generated.yml **************************
2 plays in /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_scsi_generated.yml:3
Wednesday 01 June 2022 16:54:11 +0000 (0:00:00.012) 0:00:01.317 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_scsi_generated.yml:7
Wednesday 01 June 2022 16:54:12 +0000 (0:00:01.073) 0:00:02.390 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:2
Wednesday 01 June 2022 16:54:12 +0000 (0:00:00.024) 0:00:02.415 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:10
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.819) 0:00:03.234 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.035) 0:00:03.269 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.147) 0:00:03.417 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.517) 0:00:03.935 ********
skipping:
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.074) 0:00:04.009 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.022) 0:00:04.032 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:54:13 +0000 (0:00:00.022) 0:00:04.055 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:54:14 +0000 (0:00:00.191) 0:00:04.247 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:54:14 +0000 (0:00:00.020) 0:00:04.268 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:54:15 +0000 (0:00:01.043) 0:00:05.311 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:54:15 +0000 (0:00:00.046) 0:00:05.358 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:54:15 +0000 (0:00:00.050) 0:00:05.408 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:54:15 +0000 (0:00:00.684) 0:00:06.093 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:54:16 +0000 (0:00:00.080) 0:00:06.174 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:54:16 +0000 (0:00:00.020) 0:00:06.194 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:54:16 +0000 (0:00:00.022) 0:00:06.217 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:54:16 +0000 (0:00:00.019) 0:00:06.237 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:54:16 +0000 (0:00:00.824) 0:00:07.061 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
            "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
            "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "user@.service": { "name":
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:18 +0000 (0:00:01.777) 0:00:08.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:18 +0000 (0:00:00.043) 0:00:08.882 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:18 +0000 (0:00:00.027) 0:00:08.910 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.494) 0:00:09.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.030) 0:00:09.435 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.026) 0:00:09.461 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.031) 0:00:09.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.031) 0:00:09.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.030) 0:00:09.555 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.026) 0:00:09.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.030) 0:00:09.612 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.028) 0:00:09.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.028) 0:00:09.669 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:54:19 +0000 (0:00:00.441) 0:00:10.111 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:54:20 +0000 (0:00:00.028) 0:00:10.139 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:13 Wednesday 01 June 2022 16:54:20 +0000 (0:00:00.900) 0:00:11.040 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:20 Wednesday 01 June 2022 16:54:20 +0000 (0:00:00.028) 0:00:11.069 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:54:20 +0000 (0:00:00.042) 0:00:11.111 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } 
TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.525) 0:00:11.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.035) 0:00:11.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.030) 0:00:11.703 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:26 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.033) 0:00:11.737 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.046) 0:00:11.783 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:54:21 +0000 (0:00:00.043) 0:00:11.827 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.506) 0:00:12.333 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.067) 0:00:12.400 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.030) 0:00:12.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.029) 0:00:12.460 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.094) 0:00:12.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.026) 0:00:12.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.028) 0:00:12.610 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.032) 0:00:12.642 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb" ], "mount_point": "/opt/test1", "name": "test1", "raid_level": "null", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.042) 0:00:12.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.029) 0:00:12.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.030) 0:00:12.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.031) 0:00:12.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.028) 0:00:12.805 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.042) 0:00:12.847 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:22 +0000 (0:00:00.027) 0:00:12.874 ******** An exception occurred during task execution. To see the full traceback, use -vvv. The error was: blivet.errors.DeviceError: RAID level null is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level level = levels.raid_level(value) File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level raise RaidError("invalid RAID level descriptor %s" % descriptor) blivet.errors.RaidError: invalid RAID level descriptor null During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level level = self._get_level(value, self._levels) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level raise ValueError(message % {"raid_level": value, "levels": choices}) ValueError: RAID level null is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 40, in invoke_module runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.9/runpy.py", line 210, in run_module return _run_module_code(code, init_globals, run_name, mod_spec) File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code _run_code(code, mod_globals, init_globals, File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray return 
MDRaidArrayDevice(name, *args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__ raise e File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__ self.level = level File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock return m(*args, **kwargs) File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level raise errors.DeviceError(e) blivet.errors.DeviceError: RAID level null is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0). MODULE_STDERR: Shared connection to 127.0.0.3 closed. TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:54:23 +0000 (0:00:01.156) 0:00:14.031 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'exception': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor null\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level null is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File 
"/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level null is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level\r\n level = levels.raid_level(value)\r\n File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level\r\n raise RaidError("invalid RAID level descriptor %s" % descriptor)\r\nblivet.errors.RaidError: invalid RAID level descriptor null\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level\r\n level = self._get_level(value, self._levels)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File 
"/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level\r\n raise ValueError(message % {"raid_level": value, "levels": choices})\r\nValueError: RAID level null is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654102462.82-67251-248142728481352/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage\r\n File 
"/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create\r\n File "/tmp/ansible_blivet_payload_789nj7ku/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray\r\n return MDRaidArrayDevice(name, *args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__\r\n raise e\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__\r\n self.level = level\r\n File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock\r\n return m(*args, **kwargs)\r\n File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level\r\n raise errors.DeviceError(e)\r\nblivet.errors.DeviceError: RAID level null is an invalid value. 
Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).\r\n', u'failed': True, u'rc': 1, u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:23 +0000 (0:00:00.034) 0:00:14.066 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:43 Wednesday 01 June 2022 16:54:23 +0000 (0:00:00.028) 0:00:14.095 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=39 changed=0 unreachable=0 failed=1 skipped=21 rescued=1 ignored=0 Wednesday 01 June 2022 16:54:24 +0000 (0:00:00.019) 0:00:14.115 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_scsi_generated.yml:3 ---------- linux-system-roles.storage : make sure blivet is available -------------- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.90s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml:2 -------------------------
linux-system-roles.storage : get required packages ---------------------- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Find unused disks in the system ----------------------------------------- 0.53s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.49s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.44s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.19s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.15s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : include the appropriate provider tasks ----- 0.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : Set platform/version specific variables ---- 0.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
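[Editor's note] The DeviceError above is the expected fatal case exercised by tests_fatals_raid_volume.yml: blivet rejects a raid_level outside its supported set (raid0, raid1, raid4, raid5, raid6, raid10, linear). A hypothetical minimal reproducer is sketched below; it is not taken from this log, and the disk names and volume name are assumptions, with variable names as documented for the linux-system-roles.storage role:

```yaml
# Hypothetical playbook sketch -- not part of this test run.
- hosts: all
  vars:
    storage_volumes:
      - name: test1            # assumed volume name
        type: raid
        raid_level: "null"     # invalid: not in blivet's accepted RAID levels
        disks: [sdb, sdc]      # assumed unused disks
  roles:
    - linux-system-roles.storage
```

Running such a play should fail inside the role's blivet task with the same "RAID level null is an invalid value" message seen above.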
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:54:24 +0000 (0:00:00.021) 0:00:00.021 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:54:26 +0000 (0:00:01.281) 0:00:01.303 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_filesystem_one_disk.yml ****************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:2
Wednesday 01 June
2022 16:54:26 +0000 (0:00:00.012) 0:00:01.315 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:8
Wednesday 01 June 2022 16:54:27 +0000 (0:00:01.067) 0:00:02.383 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.036) 0:00:02.419 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.149) 0:00:02.569 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.511) 0:00:03.080 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.084) 0:00:03.165 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.026) 0:00:03.192 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:54:27 +0000 (0:00:00.026) 0:00:03.218 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:54:28 +0000 (0:00:00.191) 0:00:03.409 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:54:28 +0000 (0:00:00.018) 0:00:03.428 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:54:29 +0000 (0:00:01.052) 0:00:04.480 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:54:29 +0000 (0:00:00.047) 0:00:04.528 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:54:29 +0000 (0:00:00.044) 0:00:04.573 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:54:29 +0000 (0:00:00.681) 0:00:05.255 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:54:30 +0000 (0:00:00.078) 0:00:05.334 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:54:30 +0000 (0:00:00.019) 0:00:05.353 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:54:30 +0000 (0:00:00.020) 0:00:05.373 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:54:30 +0000 (0:00:00.017) 0:00:05.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:54:30 +0000 (0:00:00.783) 0:00:06.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:32 +0000 (0:00:01.776) 0:00:07.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:32 +0000 
(0:00:00.043) 0:00:07.995 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:32 +0000 (0:00:00.026) 0:00:08.022 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.502) 0:00:08.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.029) 0:00:08.553 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.025) 0:00:08.578 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.032) 0:00:08.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.031) 0:00:08.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.030) 0:00:08.673 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.027) 0:00:08.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.028) 0:00:08.730 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.059) 0:00:08.789 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.028) 0:00:08.818 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 
2022 16:54:33 +0000 (0:00:00.446) 0:00:09.264 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:54:33 +0000 (0:00:00.027) 0:00:09.292 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:11 Wednesday 01 June 2022 16:54:34 +0000 (0:00:00.815) 0:00:10.107 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:18 Wednesday 01 June 2022 16:54:34 +0000 (0:00:00.030) 0:00:10.137 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 16:54:34 +0000 (0:00:00.043) 0:00:10.181 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.473) 0:00:10.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.034) 0:00:10.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.028) 0:00:10.718 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Initialize a disk device with the default fs type] *********************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:22 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.030) 0:00:10.749 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.052) 0:00:10.801 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:54:35 +0000 (0:00:00.041) 0:00:10.843 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.507) 0:00:11.350 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.067) 0:00:11.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.029) 0:00:11.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.028) 0:00:11.476 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.056) 0:00:11.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.026) 0:00:11.560 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.028) 0:00:11.589 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.029) 0:00:11.619 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.033) 0:00:11.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.028) 0:00:11.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.029) 0:00:11.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:54:36 +0000 
(0:00:00.030) 0:00:11.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.029) 0:00:11.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.041) 0:00:11.811 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:54:36 +0000 (0:00:00.027) 0:00:11.839 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:54:37 +0000 (0:00:01.238) 0:00:13.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.030) 0:00:13.108 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.028) 0:00:13.136 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": 
"UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.043) 0:00:13.180 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.036) 0:00:13.216 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.036) 0:00:13.252 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:54:37 +0000 (0:00:00.027) 0:00:13.280 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:54:38 +0000 (0:00:00.890) 0:00:14.171 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=29066eb5-34fd-42ee-a360-2382e989cde0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:54:39 +0000 (0:00:00.537) 0:00:14.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:54:40 +0000 (0:00:00.627) 0:00:15.336 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:54:40 +0000 (0:00:00.352) 0:00:15.689 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:54:40 +0000 (0:00:00.029) 0:00:15.719 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:32 Wednesday 01 June 2022 16:54:41 +0000 (0:00:00.840) 0:00:16.559 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:54:41 +0000 (0:00:00.051) 0:00:16.611 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:54:41 +0000 (0:00:00.027) 0:00:16.638 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": 
"/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:54:41 +0000 (0:00:00.035) 0:00:16.673 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" 
}, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:54:41 +0000 (0:00:00.544) 0:00:17.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002783", "end": "2022-06-01 12:54:41.841402", "rc": 0, "start": "2022-06-01 12:54:41.838619" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=29066eb5-34fd-42ee-a360-2382e989cde0 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.521) 0:00:17.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002631", "end": "2022-06-01 12:54:42.226994", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:54:42.224363" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 
Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.385) 0:00:18.124 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.026) 0:00:18.151 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.032) 0:00:18.183 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.063) 0:00:18.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:54:42 +0000 (0:00:00.034) 0:00:18.281 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for 
/cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.145) 0:00:18.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.033) 0:00:18.460 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.040) 0:00:18.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] 
***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.037) 0:00:18.538 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.033) 0:00:18.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.034) 0:00:18.606 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.027) 0:00:18.634 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.027) 0:00:18.662 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.028) 0:00:18.690 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null,
"storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.030) 0:00:18.721 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=29066eb5-34fd-42ee-a360-2382e989cde0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.043) 0:00:18.764 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.032) 0:00:18.797 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.033) 0:00:18.830 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.028) 0:00:18.858 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null,
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.029) 0:00:18.888 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.036) 0:00:18.924 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:54:43 +0000 (0:00:00.034) 0:00:18.959 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102477.1191216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102477.1191216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102477.1191216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.398) 0:00:19.358 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.042) 0:00:19.400 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.040) 0:00:19.440 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.035) 0:00:19.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:19.505 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.035) 0:00:19.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.569 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.626 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.035) 0:00:19.661 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:19.690 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:19.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.749 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false,
"skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.778 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.806 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.035) 0:00:19.842 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.033) 0:00:19.875 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:19.904 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.933 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:19.962 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:19.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.049 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.031) 0:00:20.080 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:54:44 +0000
(0:00:00.028) 0:00:20.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.029) 0:00:20.167 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.225 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.030) 0:00:20.255 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:54:44 +0000 (0:00:00.028) 0:00:20.284 ********
ok: [/cache/rhel-x.qcow2] => {
"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.030) 0:00:20.315 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.344 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.372 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.401 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.030) 0:00:20.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.461 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.032) 0:00:20.494 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.030) 0:00:20.524 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.036) 0:00:20.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.027) 0:00:20.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.029) 0:00:20.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.063) 0:00:20.682 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.029) 0:00:20.712 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.028) 0:00:20.769 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.029) 0:00:20.799 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.030) 0:00:20.830 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:34
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.031) 0:00:20.861 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday
01 June 2022 16:54:45 +0000 (0:00:00.056) 0:00:20.918 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:54:45 +0000 (0:00:00.042) 0:00:20.961 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.501) 0:00:21.462 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.070) 0:00:21.532 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK
[linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.029) 0:00:21.562 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.029) 0:00:21.591 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.060) 0:00:21.652 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.024) 0:00:21.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.028) 0:00:21.705 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.031) 0:00:21.737
********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.034) 0:00:21.771 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.031) 0:00:21.803 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.029) 0:00:21.833 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.030) 0:00:21.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.030) 0:00:21.894 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.043) 0:00:21.938 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:54:46 +0000 (0:00:00.028) 0:00:21.966 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday
01 June 2022 16:54:47 +0000 (0:00:01.071) 0:00:23.037 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.030) 0:00:23.068 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.027) 0:00:23.095 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null,
"raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.036) 0:00:23.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.048) 0:00:23.180 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.045) 0:00:23.226 ********

TASK
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:54:47 +0000 (0:00:00.030) 0:00:23.256 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:54:48 +0000 (0:00:00.642) 0:00:23.899 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=29066eb5-34fd-42ee-a360-2382e989cde0', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:54:48 +0000 (0:00:00.379) 0:00:24.279 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:54:49 +0000 (0:00:00.618) 0:00:24.898 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:54:49 +0000 (0:00:00.369) 0:00:25.267 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:54:50 +0000 (0:00:00.028) 0:00:25.295 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:44 Wednesday 01 June 2022 16:54:50 +0000 (0:00:00.893) 0:00:26.189 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:54:51 +0000 (0:00:00.108) 0:00:26.298 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:54:51 +0000 (0:00:00.030) 0:00:26.328 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:54:51 +0000 (0:00:00.037) 0:00:26.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] 
*************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:54:51 +0000 (0:00:00.367) 0:00:26.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002582", "end": "2022-06-01 12:54:51.202815", "rc": 0, "start": "2022-06-01 12:54:51.200233" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=29066eb5-34fd-42ee-a360-2382e989cde0 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:54:51 +0000 (0:00:00.366) 0:00:27.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002451", "end": "2022-06-01 12:54:51.567156", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:54:51.564705" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.363) 0:00:27.464 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.028) 0:00:27.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.028) 0:00:27.522 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.058) 0:00:27.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.034) 0:00:27.615 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.105) 0:00:27.720 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.035) 0:00:27.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "29066eb5-34fd-42ee-a360-2382e989cde0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.040) 0:00:27.796 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.041) 0:00:27.838 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.039) 
0:00:27.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.037) 0:00:27.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.033) 0:00:27.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.037) 0:00:27.987 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.028) 0:00:28.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.026) 0:00:28.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=29066eb5-34fd-42ee-a360-2382e989cde0 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.041) 0:00:28.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.031) 0:00:28.115 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.039) 0:00:28.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.038) 0:00:28.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:54:52 +0000 
(0:00:00.032) 0:00:28.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:54:52 +0000 (0:00:00.035) 0:00:28.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.037) 0:00:28.298 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102477.1191216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102477.1191216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102477.1191216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.389) 0:00:28.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.036) 0:00:28.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.034) 0:00:28.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.031) 0:00:28.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.029) 0:00:28.820 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.032) 0:00:28.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.030) 0:00:28.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.029) 0:00:28.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as 
the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.029) 0:00:28.941 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.036) 0:00:28.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.030) 0:00:29.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.028) 0:00:29.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.028) 0:00:29.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.027) 0:00:29.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.029) 0:00:29.121 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.034) 0:00:29.156 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.086) 0:00:29.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:54:53 +0000 (0:00:00.032) 0:00:29.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.031) 0:00:29.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.336 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.030) 0:00:29.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.031) 0:00:29.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.031) 0:00:29.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.028) 0:00:29.657 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.686 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.026) 0:00:29.713 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.770 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.027) 0:00:29.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.030) 0:00:29.827 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.031) 0:00:29.859 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:29.888 
********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.030) 0:00:29.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.042) 0:00:29.961 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.034) 0:00:29.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.031) 0:00:30.027 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:30.057 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.030) 0:00:30.087 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.030) 0:00:30.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.032) 0:00:30.150 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:30.180 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:46
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.029) 0:00:30.209 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:54:54 +0000 (0:00:00.071) 0:00:30.281 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.045) 0:00:30.326 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.530) 0:00:30.857 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.072) 0:00:30.929 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.032) 0:00:30.961 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.031) 0:00:30.993 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.058) 0:00:31.052 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.025) 0:00:31.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.030) 0:00:31.108 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.032) 0:00:31.141 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.035) 0:00:31.177 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.029) 0:00:31.207 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.030) 0:00:31.237 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:54:55 +0000 (0:00:00.029) 0:00:31.266 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:54:56 +0000 (0:00:00.031) 0:00:31.297 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:54:56 +0000 (0:00:00.045) 0:00:31.342 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:54:56 +0000 (0:00:00.028) 0:00:31.370 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:54:57 +0000 (0:00:01.299) 0:00:32.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.030) 0:00:32.700 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.027) 0:00:32.728 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.035) 0:00:32.764 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.032) 0:00:32.796 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.033) 0:00:32.830 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=29066eb5-34fd-42ee-a360-2382e989cde0', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:54:57 +0000 (0:00:00.394) 0:00:33.224 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:54:58 +0000 (0:00:00.631) 0:00:33.856 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:54:58 +0000 (0:00:00.029) 0:00:33.885 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:54:59 +0000 (0:00:00.621) 0:00:34.507 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:54:59 +0000 (0:00:00.354) 0:00:34.861 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:54:59 +0000 (0:00:00.031) 0:00:34.893 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:57
Wednesday 01 June 2022 16:55:00 +0000 (0:00:00.800) 0:00:35.694 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:55:00 +0000 (0:00:00.057) 0:00:35.752 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:55:00 +0000 (0:00:00.031) 0:00:35.783 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=29066eb5-34fd-42ee-a360-2382e989cde0", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:55:00 +0000 (0:00:00.037) 0:00:35.820 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:55:00 +0000 (0:00:00.379) 0:00:36.200 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002598", "end": "2022-06-01 12:55:00.667265", "rc": 0, "start": "2022-06-01 12:55:00.664667" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.367) 0:00:36.567 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002723", "end": "2022-06-01 12:55:01.046228", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:55:01.043505" }

STDERR:

cat: /etc/crypttab: No such file or directory

MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.066) 0:00:36.944 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.029) 0:00:37.011 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.029) 0:00:37.041 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
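The [WARNING] above points at a concrete fix in the test playbooks: give the inner task include its own loop variable via `loop_control` so it stops shadowing `storage_test_volume` from an outer scope. A minimal sketch (task and variable names here are illustrative, not taken from the test suite):

```yaml
# Hypothetical task: loop over the volumes list with a distinct loop_var,
# which silences Ansible's loop-variable collision warning.
- name: Verify each volume
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_volumes_list }}"
  loop_control:
    loop_var: storage_test_volume_item   # any name not already in use
```

With a distinct `loop_var`, the included tasks no longer overwrite a `storage_test_volume` variable that is already defined by an enclosing loop.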
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.060) 0:00:37.101 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.034) 0:00:37.136 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.106) 0:00:37.243 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:55:01 +0000 (0:00:00.034) 0:00:37.277 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.037) 0:00:37.315 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:37.344 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.031) 0:00:37.375 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.027) 0:00:37.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:37.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:37.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:37.491 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.033) 0:00:37.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.045) 0:00:37.570 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.025) 0:00:37.596 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.034) 0:00:37.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.028) 0:00:37.659 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:37.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.030) 0:00:37.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.024) 0:00:37.744 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102496.7101214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102496.7101214, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102496.7101214, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.395) 0:00:38.140 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.041) 0:00:38.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.029) 0:00:38.210 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.031) 0:00:38.242 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:55:02 +0000 (0:00:00.028) 0:00:38.270 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:38.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.330 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.038) 0:00:38.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.035) 0:00:38.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.026) 0:00:38.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.032) 0:00:38.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.526 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:38.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.586 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.037) 0:00:38.623 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.034) 0:00:38.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.031) 0:00:38.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:38.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.028) 0:00:38.747 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.777 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:38.806 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:55:03
+0000 (0:00:00.029) 0:00:38.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.031) 0:00:38.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.028) 0:00:38.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:38.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:38.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.028) 0:00:38.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:39.013 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.033) 0:00:39.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:39.077 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.039) 0:00:39.117 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.029) 0:00:39.147 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.030) 0:00:39.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:55:03 +0000 (0:00:00.031) 0:00:39.209 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 
2022 16:55:03 +0000 (0:00:00.033) 0:00:39.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.068) 0:00:39.311 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.034) 0:00:39.345 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.031) 0:00:39.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 
01 June 2022 16:55:04 +0000 (0:00:00.027) 0:00:39.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.030) 0:00:39.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.028) 0:00:39.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.034) 0:00:39.640 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=199 changed=4 unreachable=0 failed=0 skipped=177 rescued=0 ignored=0 Wednesday 01 June 2022 16:55:04 +0000 (0:00:00.014) 0:00:39.655 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.30s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.24s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:2 ------------------------ linux-system-roles.storage : make sure blivet is available -------------- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.89s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.62s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.62s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Collect info about the volumes. 
----------------------------------------- 0.54s /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 ----------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:55:05 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:55:06 +0000 (0:00:01.275) 0:00:01.299 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_filesystem_one_disk_nvme_generated.yml *************************
2 plays in /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:55:06 +0000 (0:00:00.016) 0:00:01.315 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 16:55:07 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 16:55:08 +0000 (0:00:01.259) 0:00:01.282 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_filesystem_one_disk_scsi_generated.yml *************************
2 plays in /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_scsi_generated.yml:3
Wednesday 01 June 2022 16:55:08 +0000 (0:00:00.013) 0:00:01.296 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_scsi_generated.yml:7
Wednesday 01 June 2022 16:55:09 +0000 (0:00:01.102) 0:00:02.399 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_use_interface": "scsi"}, "changed": false}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:2
Wednesday 01 June 2022 16:55:09 +0000 (0:00:00.024) 0:00:02.423 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:8
Wednesday 01 June 2022 16:55:10 +0000 (0:00:00.779) 0:00:03.203 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:55:10 +0000 (0:00:00.039) 0:00:03.243 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:55:10 +0000 (0:00:00.152) 0:00:03.395 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.526) 0:00:03.921 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.074) 0:00:03.996 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.022) 0:00:04.018 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.021) 0:00:04.040 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.187) 0:00:04.227 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:55:11 +0000 (0:00:00.018) 0:00:04.246 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:55:12 +0000 (0:00:01.051) 0:00:05.298 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:55:12 +0000 (0:00:00.049) 0:00:05.347 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:55:12 +0000 (0:00:00.046) 0:00:05.394 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:55:13 +0000 (0:00:00.696) 0:00:06.091 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:55:13 +0000 (0:00:00.081) 0:00:06.172 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:55:13 +0000 (0:00:00.022) 0:00:06.194 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:55:13 +0000 (0:00:00.021) 0:00:06.216 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:55:13 +0000 (0:00:00.020) 0:00:06.236 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:55:14 +0000 (0:00:00.785) 0:00:07.021 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
            "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
            "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias"},
            "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
            "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-final.service": {"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cockpit-motd.service": {"name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "cockpit-wsinstance-http.service": {"name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static"},
            "cockpit-wsinstance-https-factory@.service": {"name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "cockpit-wsinstance-https@.service": {"name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "cockpit.service": {"name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static"},
            "console-getty.service": {"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "container-getty@.service": {"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "cpupower.service": {"name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "crond.service": {"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled"},
            "dbus-broker.service": {"name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled"},
            "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias"},
            "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias"},
            "dbus.service": {"name": "dbus.service", "source": "systemd", "state": "active", "status": "alias"},
            "debug-shell.service": {"name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "dm-event.service": {"name": "dm-event.service", "source": "systemd", "state": "running", "status": "static"},
            "dnf-makecache.service": {"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-cmdline.service": {"name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-initqueue.service": {"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-mount.service": {"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown.service": {"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static"},
            "emergency.service": {"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static"},
            "fstrim.service": {"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static"},
            "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static"},
            "fwupd-refresh.service": {"name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static"},
            "fwupd.service": {"name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "getty@.service": {"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
            "getty@tty1.service": {"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown"},
            "grub-boot-indeterminate.service":
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:55:15 +0000 (0:00:01.817) 0:00:08.839 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:55:15 +0000 (0:00:00.040) 0:00:08.879 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.029) 0:00:08.909 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.499) 0:00:09.408 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.030) 0:00:09.439 ********
TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.027) 0:00:09.467 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }
TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.035) 0:00:09.502 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.034) 0:00:09.536 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.038) 0:00:09.575 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.027) 0:00:09.602 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.028) 0:00:09.631 ********
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.026) 0:00:09.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:55:16 +0000 (0:00:00.030) 0:00:09.689 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }
TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:55:17 +0000 (0:00:00.461) 0:00:10.150 ********
TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:55:17 +0000 (0:00:00.027) 0:00:10.178 ********
ok: [/cache/rhel-x.qcow2]
TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:11
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.820) 0:00:10.998 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:18
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.030) 0:00:11.028 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2
TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.043) 0:00:11.072 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }
TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.523) 0:00:11.596 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }
TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.035) 0:00:11.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.029) 0:00:11.660 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }
TASK [Initialize a disk device with the default fs type] ***********************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:22
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.030) 0:00:11.691 ********
TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.054) 0:00:11.745 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:55:18 +0000 (0:00:00.043) 0:00:11.788 ********
ok: [/cache/rhel-x.qcow2]
TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.516) 0:00:12.304 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.068) 0:00:12.373 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.030) 0:00:12.403 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.027) 0:00:12.431 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.059) 0:00:12.490 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.055) 0:00:12.545 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.030) 0:00:12.576 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.031) 0:00:12.608 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.034) 0:00:12.643 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.029) 0:00:12.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.028) 0:00:12.701 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.030) 0:00:12.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.028) 0:00:12.760 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.040) 0:00:12.800 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:55:19 +0000 (0:00:00.026) 0:00:12.826 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }
TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:55:21 +0000 (0:00:01.255) 0:00:14.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.030) 0:00:14.113 ********
TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.026) 0:00:14.139 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }
TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.037) 0:00:14.177 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.032) 0:00:14.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.034) 0:00:14.244 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:55:21 +0000 (0:00:00.032) 0:00:14.277 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:55:22 +0000 (0:00:00.902) 0:00:15.179 
******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:55:22 +0000 (0:00:00.570) 0:00:15.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:55:23 +0000 (0:00:00.663) 0:00:16.413 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:55:23 +0000 (0:00:00.361) 0:00:16.774 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:55:23 +0000 (0:00:00.029) 0:00:16.803 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:32 Wednesday 01 June 2022 16:55:24 +0000 
(0:00:00.812) 0:00:17.616 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:55:24 +0000 (0:00:00.053) 0:00:17.670 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:55:24 +0000 (0:00:00.030) 0:00:17.701 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:55:24 +0000 (0:00:00.035) 0:00:17.737 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:55:25 +0000 (0:00:00.528) 0:00:18.265 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.002834", "end": "2022-06-01 12:55:25.262107", "rc": 0, "start": "2022-06-01 12:55:25.259273" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:55:25 +0000 (0:00:00.500) 0:00:18.766 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003322", "end": "2022-06-01 12:55:25.646404", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:55:25.643082" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.380) 0:00:19.146 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.028) 0:00:19.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.031) 0:00:19.205 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.063) 0:00:19.269 ******** ok: [/cache/rhel-x.qcow2] => 
{ "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.034) 0:00:19.303 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.116) 0:00:19.420 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.036) 0:00:19.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.041) 0:00:19.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.041) 0:00:19.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.035) 0:00:19.575 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.037) 0:00:19.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.029) 0:00:19.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.032) 0:00:19.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.028) 0:00:19.703 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.029) 0:00:19.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 
16:55:26 +0000 (0:00:00.045) 0:00:19.779 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.031) 0:00:19.810 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.037) 0:00:19.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:55:26 +0000 (0:00:00.031) 0:00:19.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.034) 0:00:19.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.036) 0:00:19.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.034) 0:00:19.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102520.5301216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102520.5301216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102520.5301216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.379) 0:00:20.364 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.036) 0:00:20.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.037) 0:00:20.438 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.030) 0:00:20.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.030) 0:00:20.499 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.035) 0:00:20.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.029) 0:00:20.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.028) 0:00:20.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.029) 0:00:20.622 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:55:27 
+0000 (0:00:00.038) 0:00:20.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.027) 0:00:20.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.027) 0:00:20.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.030) 0:00:20.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.031) 0:00:20.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.032) 0:00:20.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.039) 0:00:20.849 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:55:27 +0000 (0:00:00.033) 0:00:20.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:20.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:20.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:20.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:20.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:21.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:21.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:21.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.028) 0:00:21.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.063) 0:00:21.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.314 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.031) 0:00:21.346 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.375 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.027) 0:00:21.402 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.431 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.028) 0:00:21.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.490 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.032) 0:00:21.522 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.031) 0:00:21.553 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.029) 0:00:21.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.033) 0:00:21.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.032) 0:00:21.799 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.829 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:34 Wednesday 01 June 2022 16:55:28 +0000 (0:00:00.030) 0:00:21.860 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.058) 0:00:21.918 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.045) 0:00:21.963 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.517) 0:00:22.481 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.070) 0:00:22.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.029) 0:00:22.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.027) 0:00:22.609 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 
01 June 2022 16:55:29 +0000 (0:00:00.060) 0:00:22.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.025) 0:00:22.695 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.030) 0:00:22.725 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.032) 0:00:22.758 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.034) 0:00:22.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.031) 0:00:22.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.031) 0:00:22.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:55:29 +0000 (0:00:00.030) 0:00:22.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:55:30 +0000 (0:00:00.034) 0:00:22.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:55:30 +0000 (0:00:00.045) 0:00:22.965 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:55:30 +0000 (0:00:00.027) 0:00:22.993 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": 
"/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:55:31 +0000 (0:00:01.061) 0:00:24.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.031) 0:00:24.087 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.066) 0:00:24.153 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", 
"/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.036) 0:00:24.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.033) 0:00:24.223 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", 
"_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.034) 0:00:24.257 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:55:31 +0000 (0:00:00.028) 0:00:24.286 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:55:32 +0000 (0:00:00.703) 0:00:24.989 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, 
"fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:55:32 +0000 (0:00:00.550) 0:00:25.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:55:33 +0000 (0:00:00.670) 0:00:26.211 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:55:33 +0000 (0:00:00.366) 0:00:26.577 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:55:33 +0000 (0:00:00.028) 0:00:26.606 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:44 Wednesday 01 June 2022 16:55:34 +0000 (0:00:00.869) 0:00:27.475 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:55:34 +0000 (0:00:00.054) 0:00:27.529 ******** skipping: [/cache/rhel-x.qcow2] 
=> {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:55:34 +0000 (0:00:00.029) 0:00:27.559 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:55:34 +0000 (0:00:00.038) 0:00:27.598 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.385) 0:00:27.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.003224", "end": "2022-06-01 12:55:34.850086", "rc": 0, "start": "2022-06-01 12:55:34.846862" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.372) 0:00:28.356 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002672", "end": "2022-06-01 12:55:35.219811", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:55:35.217139" } STDERR: cat: /etc/crypttab: No such file or directory MSG: non-zero return code TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.365) 0:00:28.721 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.028) 0:00:28.750 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.030) 0:00:28.781 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.060) 0:00:28.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:55:35 +0000 (0:00:00.034) 0:00:28.876 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.113) 0:00:28.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.066) 0:00:29.056 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.046) 0:00:29.102 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.039) 0:00:29.142 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.035) 0:00:29.178 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.035) 0:00:29.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.028) 0:00:29.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.030) 0:00:29.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.028) 0:00:29.302 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.029) 0:00:29.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.046) 0:00:29.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.032) 0:00:29.410 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.035) 0:00:29.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.032) 0:00:29.479 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.030) 0:00:29.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.037) 0:00:29.546 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:55:36 +0000 (0:00:00.037) 0:00:29.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102520.5301216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102520.5301216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102520.5301216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.386) 0:00:29.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.037) 0:00:30.007 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.035) 0:00:30.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.032) 0:00:30.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.028) 0:00:30.103 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.033) 0:00:30.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.028) 0:00:30.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 
June 2022 16:55:37 +0000 (0:00:00.028) 0:00:30.223 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.037) 0:00:30.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.033) 0:00:30.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.030) 0:00:30.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.412 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.039) 0:00:30.452 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.034) 0:00:30.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.031) 0:00:30.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.030) 0:00:30.577 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.031) 0:00:30.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.033) 0:00:30.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.031) 0:00:30.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.030) 0:00:30.732 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.031) 0:00:30.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.036) 0:00:30.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.033) 0:00:30.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.863 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:55:37 +0000 (0:00:00.029) 0:00:30.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.030) 0:00:30.923 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.031) 0:00:30.954 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.028) 
0:00:30.983 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.031) 0:00:31.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.043 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.072 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.028) 0:00:31.101 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.032) 0:00:31.134 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.030) 0:00:31.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.030) 0:00:31.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.224 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.027) 0:00:31.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.027) 0:00:31.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.030) 0:00:31.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.398 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:46 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.029) 0:00:31.457 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.107) 0:00:31.564 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:55:38 +0000 (0:00:00.042) 0:00:31.607 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.526) 0:00:32.134 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.071) 0:00:32.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.029) 0:00:32.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 16:55:39 +0000 (0:00:00.029) 0:00:32.264 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.059) 0:00:32.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.026) 0:00:32.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.028) 0:00:32.378 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.031) 0:00:32.409 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "mount_point": "/opt/test1", "name": "test1", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.035) 0:00:32.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.029) 0:00:32.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.030) 0:00:32.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.031) 0:00:32.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.029) 0:00:32.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.043) 0:00:32.609 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:55:39 +0000 (0:00:00.027) 0:00:32.637 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": 
"/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:55:41 +0000 (0:00:01.322) 0:00:33.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.030) 0:00:33.989 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.027) 0:00:34.017 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.036) 0:00:34.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.033) 0:00:34.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.034) 0:00:34.121 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:55:41 +0000 (0:00:00.383)       0:00:34.505 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "name": null, "status": {}}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:55:42 +0000 (0:00:00.631)       0:00:35.136 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:55:42 +0000 (0:00:00.027)       0:00:35.164 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "name": null, "status": {}}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:55:42 +0000 (0:00:00.680)       0:00:35.845 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:55:43 +0000 (0:00:00.365)       0:00:36.210 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:55:43 +0000 (0:00:00.027)       0:00:36.237 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:57
Wednesday 01 June 2022 16:55:44 +0000 (0:00:00.865)       0:00:37.103 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 16:55:44 +0000 (0:00:00.062)       0:00:37.165 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 16:55:44 +0000 (0:00:00.032)       0:00:37.197 ********
ok: [/cache/rhel-x.qcow2] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=f12c81b6-b0d9-4cbd-b1d7-eebce10d0078",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": ["sda"],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "absent",
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:55:44 +0000 (0:00:00.039)       0:00:37.236 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/sda": {"fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sdb": {"fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sdc": {"fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/sr0": {"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00"},
        "/dev/vda": {"fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vda1": {"fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": ""},
        "/dev/vda2": {"fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7"},
        "/dev/vda3": {"fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7"},
        "/dev/vda4": {"fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345"},
        "/dev/vdb": {"fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdc": {"fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": ""},
        "/dev/vdd": {"fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": ""}
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:55:44 +0000 (0:00:00.376)       0:00:37.613 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002849", "end": "2022-06-01 12:55:44.479093", "rc": 0, "start": "2022-06-01 12:55:44.476244"}
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.369)       0:00:37.982 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.003310", "end": "2022-06-01 12:55:44.852407", "failed_when_result": false, "rc": 1, "start": "2022-06-01 12:55:44.849097"}
STDERR:
cat: /etc/crypttab: No such file or directory
MSG: non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.373)       0:00:38.355 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.028)       0:00:38.384 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.029)       0:00:38.414 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.059)       0:00:38.473 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.035)       0:00:38.509 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.112)       0:00:38.622 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/sda"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.033)       0:00:38.656 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.038)       0:00:38.694 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.029)       0:00:38.723 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.035)       0:00:38.759 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.032)       0:00:38.791 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.030)       0:00:38.822 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.034)       0:00:38.856 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:55:45 +0000 (0:00:00.029)       0:00:38.886 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.033)       0:00:38.919 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.045)       0:00:38.965 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.027)       0:00:38.993 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.038)       0:00:39.032 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.029)       0:00:39.061 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.030)       0:00:39.091 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.035)       0:00:39.127 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.024)       0:00:39.152 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654102540.3801215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654102540.3801215,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 312,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1654102540.3801215,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.381)       0:00:39.534 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.038)       0:00:39.572 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.075)       0:00:39.648 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "disk"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.033)       0:00:39.681 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.030)       0:00:39.712 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.025)       0:00:39.737 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.029)       0:00:39.767 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.029)       0:00:39.796 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.035)       0:00:39.831 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.025)       0:00:39.857 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:55:46 +0000 (0:00:00.028)       0:00:39.886 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.028)       0:00:39.915 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.033)       0:00:39.948 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.032)       0:00:39.981 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.032)       0:00:40.013 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.039)       0:00:40.053 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.034)       0:00:40.088 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.029)       0:00:40.118 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.028)       0:00:40.146 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.031)       0:00:40.178 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.033)       0:00:40.211 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.242 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.272 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.029)       0:00:40.302 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.029)       0:00:40.331 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.028)       0:00:40.360 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.031)       0:00:40.392 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.422 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.452 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.029)       0:00:40.482 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.512 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.031)       0:00:40.544 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.033)       0:00:40.577 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.029)       0:00:40.607 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.028)       0:00:40.635 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.028)       0:00:40.664 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.695 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.033)       0:00:40.728 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.034)       0:00:40.763 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.033)       0:00:40.796 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.035)       0:00:40.832 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.863 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:55:47 +0000 (0:00:00.030)       0:00:40.893 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.030)       0:00:40.924 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.033)       0:00:40.957 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.029)       0:00:40.987 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.029)       0:00:41.016 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.028)       0:00:41.045 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=201 changed=4 unreachable=0 failed=0 skipped=177 rescued=0 ignored=0

Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.014)       0:00:41.060 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.32s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.26s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_scsi_generated.yml:3 ---------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : make sure blivet is available -------------- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Gathering Facts --------------------------------------------------------- 0.78s
/tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml:2 ------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : get required packages ---------------------- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.63s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:55:48 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:55:50 +0000 (0:00:01.247) 0:00:01.269 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_include_vars_from_parent.yml *********************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:1 Wednesday 
01 June 2022 16:55:50 +0000 (0:00:00.011) 0:00:01.281 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [create var file in caller that can override the one in called role] ****** task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:3 Wednesday 01 June 2022 16:55:51 +0000 (0:00:01.106) 0:00:02.388 ******** changed: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat-9.1) => { "ansible_loop_var": "item", "changed": true, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.1.yml", "gid": 0, "group": "root", "item": "RedHat-9.1", "md5sum": "5a57da448a1d752b982858b38aab344d", "mode": "0600", "owner": "root", "size": 23, "src": "/root/.ansible/tmp/ansible-tmp-1654102551.33-70173-124781113425872/source", "state": "file", "uid": 0 } changed: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat-9) => { "ansible_loop_var": "item", "changed": true, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.yml", "gid": 0, "group": "root", "item": "RedHat-9", "md5sum": "5a57da448a1d752b982858b38aab344d", "mode": "0600", "owner": "root", "size": 23, "src": "/root/.ansible/tmp/ansible-tmp-1654102551.8-70173-187588638075315/source", "state": "file", "uid": 0 } changed: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat_9.1) => { "ansible_loop_var": "item", "changed": true, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.1.yml", "gid": 0, "group": "root", "item": "RedHat_9.1", "md5sum": "5a57da448a1d752b982858b38aab344d", "mode": "0600", "owner": "root", "size": 23, "src": "/root/.ansible/tmp/ansible-tmp-1654102552.07-70173-214957671710616/source", "state": "file", "uid": 0 } changed: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat_9) => { "ansible_loop_var": "item", "changed": true, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": 
"/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.yml", "gid": 0, "group": "root", "item": "RedHat_9", "md5sum": "5a57da448a1d752b982858b38aab344d", "mode": "0600", "owner": "root", "size": 23, "src": "/root/.ansible/tmp/ansible-tmp-1654102552.31-70173-8048189796504/source", "state": "file", "uid": 0 } changed: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat) => { "ansible_loop_var": "item", "changed": true, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat.yml", "gid": 0, "group": "root", "item": "RedHat", "md5sum": "5a57da448a1d752b982858b38aab344d", "mode": "0600", "owner": "root", "size": 23, "src": "/root/.ansible/tmp/ansible-tmp-1654102552.57-70173-129022268068191/source", "state": "file", "uid": 0 } TASK [include_role : {{ roletoinclude }}] ************************************** task path: /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:4 Wednesday 01 June 2022 16:55:52 +0000 (0:00:01.553) 0:00:03.941 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:55:52 +0000 (0:00:00.042) 0:00:03.983 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.159) 0:00:04.143 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.503) 0:00:04.646 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": 
"Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.054) 0:00:04.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.022) 0:00:04.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.021) 0:00:04.745 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.170) 0:00:04.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:55:53 +0000 (0:00:00.018) 0:00:04.934 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:55:54 +0000 (0:00:00.999) 0:00:05.934 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:55:54 +0000 (0:00:00.025) 0:00:05.960 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:55:54 +0000 (0:00:00.023) 0:00:05.983 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:55:55 +0000 (0:00:00.663) 0:00:06.647 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:55:55 +0000 (0:00:00.042) 0:00:06.689 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:55:55 +0000 (0:00:00.020) 0:00:06.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:55:55 +0000 (0:00:00.022) 0:00:06.732 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:55:55 +0000 (0:00:00.021) 0:00:06.754 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:55:56 +0000 (0:00:00.801) 0:00:07.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:55:58 +0000 (0:00:01.797) 0:00:09.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.042) 0:00:09.395 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.026) 0:00:09.421 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.505) 0:00:09.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.029) 0:00:09.957 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.025) 0:00:09.982 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.030) 0:00:10.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.032) 0:00:10.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.030) 0:00:10.076 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:55:58 +0000 (0:00:00.027) 0:00:10.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:55:59 +0000 (0:00:00.029) 0:00:10.133 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:55:59 +0000 (0:00:00.025) 0:00:10.158 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:55:59 +0000 (0:00:00.025) 0:00:10.184 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:55:59 +0000 (0:00:00.363) 0:00:10.547 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:55:59 +0000 (0:00:00.027) 0:00:10.575 ******** ok: [/cache/rhel-x.qcow2] TASK [caller : assert] ********************************************************* task path: /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:7 Wednesday 01 June 2022 16:56:00 +0000 (0:00:00.824) 0:00:11.399 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=24 changed=1 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:00 +0000 (0:00:00.018) 0:00:11.418 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 create var file in caller that can override the one in called role ------ 1.55s /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:3 ------------------- set up internal repositories 
-------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.11s /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:1 ------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.50s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.36s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.17s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : Set platform/version specific variables ---- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.04s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 include_role : {{ roletoinclude }} -------------------------------------- 0.04s /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:4 -------------------------- linux-system-roles.storage : enable copr repositories if needed --------- 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : set the list of pools for test verification --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 linux-system-roles.storage : set the list of volumes for test verification --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 linux-system-roles.storage : show blivet_output ------------------------- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. 
Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:56:01 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:02 +0000 (0:00:01.288) 0:00:01.312 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 
----------------------------------------------------- PLAYBOOK: tests_include_vars_from_parent_nvme_generated.yml ******************** 2 plays in /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:02 +0000 (0:00:00.014) 0:00:01.327 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. 
Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:56:03 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:04 +0000 (0:00:01.262) 0:00:01.286 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.26s 
/cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_include_vars_from_parent_scsi_generated.yml ******************** 2 plays in /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_scsi_generated.yml:3 Wednesday 01 June 2022 16:56:04 +0000 (0:00:00.013) 0:00:01.300 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_scsi_generated.yml:7 Wednesday 01 June 2022 16:56:05 +0000 (0:00:01.067) 0:00:02.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:1 Wednesday 01 June 2022 16:56:05 +0000 (0:00:00.026) 0:00:02.393 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [create var file in caller that can override the one in called role] ****** task path: /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:3 Wednesday 01 June 2022 16:56:06 +0000 (0:00:00.827) 0:00:03.221 ******** ok: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat-9.1) => { "ansible_loop_var": "item", "changed": false, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.1.yml", "gid": 0, "group": "root", "item": "RedHat-9.1", "mode": "0600", "owner": "root", "path": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.1.yml", "size": 23, "state": "file", "uid": 0 } 
ok: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat-9) => { "ansible_loop_var": "item", "changed": false, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.yml", "gid": 0, "group": "root", "item": "RedHat-9", "mode": "0600", "owner": "root", "path": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat-9.yml", "size": 23, "state": "file", "uid": 0 } ok: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat_9.1) => { "ansible_loop_var": "item", "changed": false, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.1.yml", "gid": 0, "group": "root", "item": "RedHat_9.1", "mode": "0600", "owner": "root", "path": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.1.yml", "size": 23, "state": "file", "uid": 0 } ok: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat_9) => { "ansible_loop_var": "item", "changed": false, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.yml", "gid": 0, "group": "root", "item": "RedHat_9", "mode": "0600", "owner": "root", "path": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat_9.yml", "size": 23, "state": "file", "uid": 0 } ok: [/cache/rhel-x.qcow2 -> localhost] => (item=RedHat) => { "ansible_loop_var": "item", "changed": false, "checksum": "870b2314d3f4184a363b31373f07abb444f26444", "dest": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat.yml", "gid": 0, "group": "root", "item": "RedHat", "mode": "0600", "owner": "root", "path": "/tmp/tmp7247_7fr/tests/roles/caller/vars/RedHat.yml", "size": 23, "state": "file", "uid": 0 } TASK [include_role : {{ roletoinclude }}] ************************************** task path: /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:4 Wednesday 01 June 2022 16:56:07 +0000 (0:00:01.534) 0:00:04.755 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:56:07 +0000 (0:00:00.048) 0:00:04.803 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.166) 0:00:04.969 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.477) 0:00:05.446 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.053) 0:00:05.500 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.022) 0:00:05.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.022) 0:00:05.545 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.172) 0:00:05.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:56:08 +0000 (0:00:00.019) 0:00:05.737 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:56:09 +0000 (0:00:01.034) 0:00:06.772 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:56:09 +0000 (0:00:00.026) 0:00:06.798 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:56:09 +0000 (0:00:00.023) 0:00:06.822 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:56:10 +0000 (0:00:00.661) 0:00:07.484 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:56:10 +0000 (0:00:00.041) 0:00:07.525 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:56:10 +0000 (0:00:00.020) 0:00:07.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:56:10 +0000 (0:00:00.023) 0:00:07.569 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:56:10 +0000 (0:00:00.021) 0:00:07.591 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:56:11 +0000 (0:00:00.794) 0:00:08.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:56:13 +0000 (0:00:01.845) 0:00:10.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.044) 0:00:10.275 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.027) 0:00:10.303 ******** ok: [/cache/rhel-x.qcow2] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.509) 0:00:10.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.027) 0:00:10.840 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.026) 0:00:10.866 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:56:13 +0000 (0:00:00.032) 0:00:10.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.033) 0:00:10.933 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.032) 0:00:10.965 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.028) 0:00:10.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.028) 0:00:11.023 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.026) 0:00:11.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.032) 0:00:11.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.359) 0:00:11.442 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:56:14 +0000 (0:00:00.028) 0:00:11.471 ******** ok: [/cache/rhel-x.qcow2] TASK [caller : assert] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:7 Wednesday 01 June 2022 16:56:15 +0000 (0:00:00.812) 0:00:12.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=26 changed=0 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:15 +0000 (0:00:00.017) 0:00:12.301 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 create var file in caller that can override the one in called role ------ 1.53s /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:3 ------------------- set up internal repositories -------------------------------------------- 1.26s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_scsi_generated.yml:3 ---- linux-system-roles.storage : make sure blivet is available -------------- 1.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 0.83s /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml:1 ------------------- linux-system-roles.storage : Update facts ------------------------------- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 
0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.48s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.36s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.17s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.17s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : Set platform/version specific variables ---- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- include_role : {{ roletoinclude }} -------------------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/caller/tasks/main.yml:4 -------------------------- linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 linux-system-roles.storage : enable copr repositories if needed --------- 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : set the list of pools for test verification --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = 
[u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 16:56:16 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 16:56:17 +0000 (0:00:01.239) 0:00:01.262 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.24s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: 
/tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml

PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_luks.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:2
Wednesday 01 June 2022 16:56:17 +0000 (0:00:00.075) 0:00:01.337 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:11
Wednesday 01 June 2022 16:56:18 +0000 (0:00:01.033) 0:00:02.371 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:56:18 +0000 (0:00:00.036) 0:00:02.408 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:56:18 +0000 (0:00:00.157) 0:00:02.566 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.522) 0:00:03.088 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.075) 0:00:03.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.022) 0:00:03.187 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.021) 0:00:03.209 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.189) 0:00:03.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:56:19 +0000 (0:00:00.017) 0:00:03.416 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:56:20 +0000 (0:00:00.996) 0:00:04.413 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:56:20 +0000 (0:00:00.047) 0:00:04.460 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:56:20 +0000 (0:00:00.044) 0:00:04.504 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:56:21 +0000 (0:00:00.664) 0:00:05.169 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:56:21 +0000 (0:00:00.080) 0:00:05.250 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:56:21 +0000 (0:00:00.021) 0:00:05.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:56:21 +0000 (0:00:00.021) 0:00:05.292 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:56:21 +0000 (0:00:00.019) 0:00:05.312 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:56:22 +0000 (0:00:00.792) 0:00:06.105 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
            "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
            "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 16:56:24 +0000 (0:00:01.804) 0:00:07.909 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.044) 0:00:07.953 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.027) 0:00:07.980 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.558) 0:00:08.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.028) 0:00:08.567 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.026) 0:00:08.594 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.030) 0:00:08.624 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.032) 0:00:08.657 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.033) 0:00:08.691 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.027) 0:00:08.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.028) 0:00:08.747 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.026) 0:00:08.773 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 16:56:24 +0000 (0:00:00.027) 0:00:08.801 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 16:56:25 +0000 (0:00:00.485) 0:00:09.286 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 16:56:25 +0000 (0:00:00.028) 0:00:09.315 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:14
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.814) 0:00:10.129 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.044) 0:00:10.174 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.523) 0:00:10.698 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.036) 0:00:10.734 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.029) 0:00:10.763 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create an encrypted disk volume w/ default fs] ***************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:24
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.031) 0:00:10.795 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:56:26 +0000 (0:00:00.047) 0:00:10.843 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.043) 0:00:10.887 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.513) 0:00:11.401 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.068) 0:00:11.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.032) 0:00:11.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.032) 0:00:11.534 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.060) 0:00:11.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:56:27 +0000 (0:00:00.026) 0:00:11.621 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:56:28 +0000 (0:00:00.904) 0:00:12.525 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:56:28 +0000 (0:00:00.032) 0:00:12.558 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "encryption": true,
            "mount_point": "/opt/test1",
            "name": "foo",
            "type": "disk"
        }
    ]
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:56:28 +0000 (0:00:00.036) 0:00:12.595 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:56:29 +0000 (0:00:01.024) 0:00:13.619 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:56:29 +0000 (0:00:00.053) 0:00:13.673 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:56:29 +0000 (0:00:00.026) 0:00:13.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:56:29 +0000 (0:00:00.029) 0:00:13.729 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:56:29 +0000 (0:00:00.026) 0:00:13.755 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "rc": 0, "results": [ "Installed: cryptsetup-2.4.3-4.el9.x86_64" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:56:31 +0000 (0:00:01.222) 0:00:14.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { 
"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:56:32 +0000 (0:00:01.664) 0:00:16.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:56:32 +0000 (0:00:00.050) 0:00:16.692 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:56:32 +0000 (0:00:00.029) 0:00:16.722 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:56:33 +0000 (0:00:01.091) 0:00:17.813 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': 
None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:56:33 +0000 (0:00:00.038) 0:00:17.852 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:40 Wednesday 01 June 2022 16:56:33 +0000 (0:00:00.028) 0:00:17.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:46 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.038) 0:00:17.919 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:54 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.034) 0:00:17.954 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.050) 0:00:18.004 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.043) 0:00:18.047 
******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.516) 0:00:18.564 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.071) 0:00:18.636 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.030) 0:00:18.666 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate 
provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.029) 0:00:18.695 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.060) 0:00:18.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:56:34 +0000 (0:00:00.066) 0:00:18.822 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:56:35 +0000 (0:00:00.885) 0:00:19.708 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:56:35 +0000 (0:00:00.034) 0:00:19.742 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:56:35 +0000 
(0:00:00.036) 0:00:19.779 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:56:36 +0000 (0:00:01.028) 0:00:20.807 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:56:36 +0000 (0:00:00.052) 0:00:20.860 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:56:37 +0000 (0:00:00.027) 0:00:20.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:56:37 +0000 (0:00:00.030) 0:00:20.918 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:56:37 +0000 (0:00:00.027) 0:00:20.946 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:56:37 +0000 (0:00:00.892) 0:00:21.838 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": 
"cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" 
}, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": 
"modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { 
"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:56:39 +0000 (0:00:01.718) 0:00:23.557 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:56:39 +0000 (0:00:00.044) 0:00:23.602 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:56:39 +0000 (0:00:00.027) 0:00:23.629 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "present" } ], 
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:56:54 +0000 (0:00:14.903) 0:00:38.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 
2022 16:56:54 +0000 (0:00:00.028) 0:00:38.561 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:56:54 +0000 (0:00:00.029) 0:00:38.590 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:56:54 +0000 (0:00:00.038) 0:00:38.628 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:56:54 +0000 (0:00:00.033) 0:00:38.662 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } 
TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:56:54 +0000 (0:00:00.034) 0:00:38.696 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:56:54 +0000 (0:00:00.027) 0:00:38.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:56:55 +0000 (0:00:00.959) 0:00:39.683 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:56:56 +0000 (0:00:00.566) 0:00:40.249 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:56:57 
+0000 (0:00:00.684) 0:00:40.934 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:56:57 +0000 (0:00:00.374) 0:00:41.308 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-ce428036-b556-4c9b-a176-fa3f62d11e08', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "present" } } MSG: line added and ownership, perms or SE linux context changed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:56:57 +0000 (0:00:00.497) 0:00:41.806 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:66 Wednesday 01 June 2022 16:56:58 +0000 (0:00:00.829) 0:00:42.636 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:56:58 +0000 (0:00:00.051) 0:00:42.688 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:56:58 +0000 (0:00:00.030) 0:00:42.718 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:56:58 +0000 (0:00:00.040) 0:00:42.759 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "size": "10G", "type": "crypt", "uuid": "ec933a4c-a34d-4ed6-a248-2bc1e32211c7" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ce428036-b556-4c9b-a176-fa3f62d11e08" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:56:59 +0000 (0:00:00.467) 0:00:43.226 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002603", "end": "2022-06-01 12:56:59.242813", "rc": 0, "start": "2022-06-01 12:56:59.240210" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:56:59 +0000 (0:00:00.513) 0:00:43.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003231", "end": "2022-06-01 12:56:59.620939", "failed_when_result": 
false, "rc": 0, "start": "2022-06-01 12:56:59.617708" } STDOUT: luks-ce428036-b556-4c9b-a176-fa3f62d11e08 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.377) 0:00:44.117 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.028) 0:00:44.145 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.029) 0:00:44.175 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.063) 0:00:44.239 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.034) 0:00:44.273 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.117) 0:00:44.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.034) 0:00:44.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "ec933a4c-a34d-4ed6-a248-2bc1e32211c7" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "ec933a4c-a34d-4ed6-a248-2bc1e32211c7" } ], 
"storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.040) 0:00:44.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.037) 0:00:44.504 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.079) 0:00:44.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.037) 0:00:44.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.029) 0:00:44.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.028) 0:00:44.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.028) 0:00:44.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.028) 0:00:44.737 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.045) 0:00:44.783 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.033) 0:00:44.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.034) 0:00:44.851 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:57:00 +0000 (0:00:00.028) 0:00:44.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.029) 0:00:44.909 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.036) 0:00:44.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.038) 0:00:44.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102613.8061216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102613.8061216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/blockdevice", "mode": "0660", "mtime": 1654102613.8061216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.363) 0:00:45.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.038) 0:00:45.385 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.035) 0:00:45.420 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.032) 0:00:45.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.028) 0:00:45.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:57:01 +0000 (0:00:00.035) 0:00:45.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102613.9541216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102613.9541216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11002, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102613.9541216, "nlink": 1, "path": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.374) 0:00:45.891 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012447", "end": "2022-06-01 12:57:01.767949", "rc": 0, "start": "2022-06-01 12:57:01.755502" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: ce428036-b556-4c9b-a176-fa3f62d11e08 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 9 Memory: 906462 Threads: 4 Salt: b4 ae 73 d4 e4 fa 5b e2 cc 2d 2f cb 8a 88 32 8e fd c3 33 28 a8 f4 27 ab dc 1d 6b d6 6f a3 f6 34 
AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 97090 Salt: 5b 0b e6 95 63 1b f9 41 e6 33 20 75 d7 7b e7 dc 63 67 d1 4e 78 f8 ef e0 da a3 4a ac 24 72 aa 23 Digest: 17 8d 65 00 bd 12 73 db 5b 6d 4b a6 1c 14 73 97 1d 97 18 c3 64 31 fa b5 40 66 b1 a4 f2 95 78 fa TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.372) 0:00:46.263 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.039) 0:00:46.302 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.036) 0:00:46.339 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.037) 0:00:46.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.038) 0:00:46.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.037) 0:00:46.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.031) 0:00:46.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.030) 0:00:46.514 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ce428036-b556-4c9b-a176-fa3f62d11e08 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.037) 0:00:46.552 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.038) 0:00:46.591 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.037) 0:00:46.628 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] 
**************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.036) 0:00:46.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.036) 0:00:46.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.031) 0:00:46.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.034) 0:00:46.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.036) 0:00:46.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.031) 0:00:46.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active 
devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:57:02 +0000 (0:00:00.029) 0:00:46.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:46.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:46.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:46.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.033) 0:00:46.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:47.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:47.051 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.033) 0:00:47.084 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:47.115 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.031) 0:00:47.176 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.235 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": 
"Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.032) 0:00:47.268 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.031) 0:00:47.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.031) 0:00:47.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:47.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.027) 0:00:47.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.029) 0:00:47.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.068) 0:00:47.577 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.030) 0:00:47.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 16:57:03 +0000 (0:00:00.031) 0:00:47.639 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": 
"/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:72 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.470) 0:00:48.109 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.048) 0:00:48.157 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.044) 0:00:48.202 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.506) 0:00:48.708 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.068) 0:00:48.777 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.030) 0:00:48.808 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:57:04 +0000 (0:00:00.030) 0:00:48.838 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:57:05 +0000 (0:00:00.059) 0:00:48.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:57:05 +0000 (0:00:00.023) 0:00:48.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show 
storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:57:05 +0000 (0:00:00.902) 0:00:49.824 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:57:05 +0000 (0:00:00.033) 0:00:49.858 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:57:06 +0000 (0:00:00.036) 0:00:49.894 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:57:07 +0000 (0:00:01.089) 0:00:50.984 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:57:07 +0000 (0:00:00.056) 0:00:51.041 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:57:07 +0000 
(0:00:00.028) 0:00:51.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:57:07 +0000 (0:00:00.030) 0:00:51.100 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:57:07 +0000 (0:00:00.027) 0:00:51.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:57:08 +0000 (0:00:00.887) 0:00:52.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": 
"dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { 
"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { 
"name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:57:09 +0000 (0:00:01.691) 0:00:53.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:57:09 +0000 (0:00:00.046) 0:00:53.753 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:57:09 +0000 (0:00:00.028) 0:00:53.781 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-ce428036-b556-4c9b-a176-fa3f62d11e08' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:57:10 +0000 (0:00:01.084) 0:00:54.865 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10720641024, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-ce428036-b556-4c9b-a176-fa3f62d11e08' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.041) 0:00:54.907 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:87 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.028) 0:00:54.935 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:93 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.035) 0:00:54.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.034) 0:00:55.005 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102623.6151216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102623.6151216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, 
"isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102623.6151216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3507496878", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.372) 0:00:55.378 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:104 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.040) 0:00:55.419 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.056) 0:00:55.476 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:57:11 +0000 (0:00:00.045) 0:00:55.521 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:57:12 +0000 (0:00:00.518) 0:00:56.040 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:57:12 +0000 (0:00:00.071) 0:00:56.112 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:57:12 +0000 (0:00:00.032) 0:00:56.144 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:57:12 +0000 (0:00:00.031) 0:00:56.175 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:57:12 
+0000 (0:00:00.064) 0:00:56.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:57:12 +0000 (0:00:00.026) 0:00:56.266 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:57:13 +0000 (0:00:00.919) 0:00:57.186 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:57:13 +0000 (0:00:00.034) 0:00:57.220 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:57:13 +0000 (0:00:00.036) 0:00:57.257 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:57:14 +0000 (0:00:01.074) 0:00:58.331 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:57:14 +0000 (0:00:00.053) 0:00:58.385 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:57:14 +0000 (0:00:00.026) 0:00:58.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:57:14 +0000 (0:00:00.027) 0:00:58.440 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:57:14 +0000 (0:00:00.030) 0:00:58.470 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:57:15 +0000 (0:00:00.873) 0:00:59.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:57:17 +0000 (0:00:01.775) 0:01:01.118 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:57:17 +0000 (0:00:00.047) 0:01:01.166 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:57:17 +0000 (0:00:00.028) 0:01:01.195 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:57:18 +0000 (0:00:01.546) 0:01:02.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:57:18 +0000 (0:00:00.031) 0:01:02.773 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:57:18 +0000 (0:00:00.030) 0:01:02.803 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": 
"destroy format", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK 
[linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:57:18 +0000 (0:00:00.042) 0:01:02.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:57:18 +0000 (0:00:00.034) 0:01:02.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:57:19 +0000 (0:00:00.036) 0:01:02.916 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08', u'state': 
u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ce428036-b556-4c9b-a176-fa3f62d11e08" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:57:19 +0000 (0:00:00.383) 0:01:03.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:57:20 +0000 (0:00:00.679) 0:01:03.979 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:57:20 +0000 (0:00:00.420) 0:01:04.399 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the 
/etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:57:21 +0000 (0:00:00.668) 0:01:05.068 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102619.6201215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e538445f17125b9148a8fcaa6fdb02629faf7080", "ctime": 1654102617.3031216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102617.2971215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "407190641", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:57:21 +0000 (0:00:00.399) 0:01:05.467 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-ce428036-b556-4c9b-a176-fa3f62d11e08', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:57:21 +0000 (0:00:00.403) 0:01:05.871 ******** ok: [/cache/rhel-x.qcow2] 
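The run above (destroy the LUKS mapping on /dev/sda, recreate a plain xfs filesystem, rewrite the /opt/test1 entry in /etc/fstab, and remove the stale line from /etc/crypttab) is what the storage role does when it is re-invoked with encryption disabled on an existing encrypted volume. A minimal sketch of such an invocation, reconstructed from the volume parameters in the blivet_output above (the play structure itself is an assumption; the variable values come from this log):

```yaml
# Hypothetical sketch: re-run linux-system-roles.storage with encryption
# turned off so the role tears down the LUKS layer and reformats the disk.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            encryption: false                   # was true in the previous run
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo  # test-only password, per this log
```

Note that `encryption_password` is still supplied even though `encryption: false`, matching the volume facts in the log output above.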
TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:117 Wednesday 01 June 2022 16:57:22 +0000 (0:00:00.846) 0:01:06.717 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:57:22 +0000 (0:00:00.056) 0:01:06.774 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:57:22 +0000 (0:00:00.031) 0:01:06.805 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:57:22 +0000 (0:00:00.037) 0:01:06.842 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1df545b2-f954-4a0b-ab0b-b83e0cddaaed" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:57:23 +0000 (0:00:00.384) 0:01:07.227 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.002474", "end": "2022-06-01 12:57:23.097264", "rc": 0, "start": "2022-06-01 12:57:23.094790" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:57:23 +0000 (0:00:00.367) 0:01:07.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002926", "end": "2022-06-01 12:57:23.485731", "failed_when_result": false, "rc": 0, "start": "2022-06-01 12:57:23.482805" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.387) 0:01:07.982 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.027) 0:01:08.010 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.029) 0:01:08.040 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.059) 0:01:08.100 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.035) 0:01:08.136 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.113) 0:01:08.250 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.036) 0:01:08.286 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "1df545b2-f954-4a0b-ab0b-b83e0cddaaed" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "1df545b2-f954-4a0b-ab0b-b83e0cddaaed" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.040) 0:01:08.327 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.035) 0:01:08.362 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.035) 0:01:08.397 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.035) 0:01:08.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.031) 0:01:08.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.029) 0:01:08.494 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.028) 0:01:08.522 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.029) 0:01:08.552 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.044) 0:01:08.596 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.033) 0:01:08.630 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.036) 0:01:08.667 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.028) 0:01:08.695 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.029) 0:01:08.724 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.034) 0:01:08.759 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:57:24 +0000 (0:00:00.035) 0:01:08.795 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102638.1911216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102638.1911216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102638.1911216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.388) 0:01:09.184 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.037) 0:01:09.221 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.035) 0:01:09.257 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.034) 0:01:09.292 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.027) 0:01:09.320 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.033) 0:01:09.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.029) 0:01:09.383 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.030) 0:01:09.413 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.029) 0:01:09.443 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.038) 0:01:09.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.031) 0:01:09.513 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.031) 0:01:09.544 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.030) 0:01:09.575 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.030) 0:01:09.606 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.030) 0:01:09.636 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.037) 0:01:09.674 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.035) 0:01:09.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.029) 0:01:09.738 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.029) 0:01:09.768 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.029) 0:01:09.797 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:57:25 +0000 (0:00:00.032) 0:01:09.829 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.066) 0:01:09.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:09.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.029) 0:01:09.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:09.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.029) 0:01:10.016 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.046 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.032) 0:01:10.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.029) 0:01:10.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.028) 0:01:10.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.169 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.033) 0:01:10.202 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.232 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.033) 0:01:10.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.296 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.028) 0:01:10.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.028) 0:01:10.352 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.034) 0:01:10.387 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.033) 0:01:10.421 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.033) 0:01:10.455 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.036) 0:01:10.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.031) 0:01:10.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.031) 0:01:10.585 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.031) 0:01:10.616 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.032) 0:01:10.648 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.030) 0:01:10.679 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.031) 0:01:10.710 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 16:57:26 +0000 (0:00:00.029) 0:01:10.739 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Add encryption to the volume] ********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:123
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.390) 0:01:11.130 ********
TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.043) 0:01:11.173 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.045) 0:01:11.218 ********
ok: [/cache/rhel-x.qcow2]
TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.525) 0:01:11.744 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.078) 0:01:11.822 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 16:57:27 +0000 (0:00:00.045) 0:01:11.868 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 16:57:28 +0000 (0:00:00.036) 0:01:11.904 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 16:57:28 +0000 (0:00:00.063) 0:01:11.968 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 16:57:28 +0000 (0:00:00.026) 0:01:11.995 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 16:57:29 +0000 (0:00:00.900) 0:01:12.895 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 16:57:29 +0000 (0:00:00.093) 0:01:12.989 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 16:57:29 +0000 (0:00:00.037) 0:01:13.026 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 16:57:30 +0000 (0:00:01.047) 0:01:14.073 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 16:57:30 +0000 (0:00:00.052) 0:01:14.126 ********
TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 16:57:30 +0000 (0:00:00.029) 0:01:14.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 16:57:30 +0000 (0:00:00.029) 0:01:14.185 ********
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 16:57:30 +0000 (0:00:00.027) 0:01:14.212 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 16:57:31 +0000 (0:00:00.807) 0:01:15.020 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
"insights-client.service": { "name": "insights-client.service", "source": "systemd", "state":
"inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": 
"oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { 
"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": 
"sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service": { "name": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:57:33 +0000 (0:00:01.934) 0:01:16.954 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:57:33 +0000 (0:00:00.049) 0:01:17.004 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2dce428036\x2db556\x2d4c9b\x2da176\x2dfa3f62d11e08.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "name": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target dev-sda.device systemd-journald.socket systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2dce428036\\\\x2db556\\\\x2d4c9b\\\\x2da176\\\\x2dfa3f62d11e08.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", 
"ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-ce428036-b556-4c9b-a176-fa3f62d11e08", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ce428036-b556-4c9b-a176-fa3f62d11e08 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ce428036-b556-4c9b-a176-fa3f62d11e08 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ce428036-b556-4c9b-a176-fa3f62d11e08 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ce428036-b556-4c9b-a176-fa3f62d11e08 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not 
set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dce428036\\\\x2db556\\\\x2d4c9b\\\\x2da176\\\\x2dfa3f62d11e08.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", 
"StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 12:57:20 EDT", "StateChangeTimestampMonotonic": "2229573551", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dce428036\\\\x2db556\\\\x2d4c9b\\\\x2da176\\\\x2dfa3f62d11e08.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:57:33 +0000 (0:00:00.716) 0:01:17.720 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:57:34 +0000 (0:00:01.061) 0:01:18.782 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:57:34 +0000 (0:00:00.042) 0:01:18.825 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dce428036\x2db556\x2d4c9b\x2da176\x2dfa3f62d11e08.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "name": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner 
cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", 
"KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dce428036\\x2db556\\x2d4c9b\\x2da176\\x2dfa3f62d11e08.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dce428036\\\\x2db556\\\\x2d4c9b\\\\x2da176\\\\x2dfa3f62d11e08.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
"UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:138 Wednesday 01 June 2022 16:57:35 +0000 (0:00:00.688) 0:01:19.514 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:144 Wednesday 01 June 2022 16:57:35 +0000 (0:00:00.036) 0:01:19.550 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 16:57:35 +0000 (0:00:00.036) 0:01:19.586 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102646.6341214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102646.6341214, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102646.6341214, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3330698955", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.403) 0:01:19.989 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:155 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.042) 0:01:20.032 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.050) 0:01:20.083 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.044) 0:01:20.127 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.528) 0:01:20.656 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.073) 0:01:20.729 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.031) 0:01:20.761 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:57:36 +0000 (0:00:00.032) 0:01:20.793 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:57:37 +0000 (0:00:00.113) 0:01:20.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:57:37 +0000 (0:00:00.026) 0:01:20.933 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:57:37 +0000 (0:00:00.881) 0:01:21.815 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:57:37 +0000 (0:00:00.034) 0:01:21.849 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:57:38 +0000 (0:00:00.036) 0:01:21.886 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:57:39 +0000 (0:00:01.024) 0:01:22.910 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:57:39 +0000 (0:00:00.055) 0:01:22.966 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:57:39 +0000 (0:00:00.029) 0:01:22.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:57:39 +0000 (0:00:00.030) 0:01:23.025 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:57:39 +0000 (0:00:00.028) 0:01:23.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:57:40 +0000 (0:00:00.850) 0:01:23.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": 
"unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:57:41 +0000 (0:00:01.692) 0:01:25.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:57:41 +0000 (0:00:00.046) 0:01:25.644 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:57:41 +0000 (0:00:00.027) 0:01:25.672 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:57:58 +0000 (0:00:16.884) 0:01:42.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:57:58 +0000 (0:00:00.033) 0:01:42.589 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:57:58 +0000 (0:00:00.028) 0:01:42.618 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": 
"xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:57:58 +0000 (0:00:00.042) 0:01:42.661 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:57:58 +0000 (0:00:00.038) 0:01:42.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:57:58 +0000 (0:00:00.038) 0:01:42.739 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", 
"path": "/opt/test1", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=1df545b2-f954-4a0b-ab0b-b83e0cddaaed" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:57:59 +0000 (0:00:00.404) 0:01:43.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:57:59 +0000 (0:00:00.691) 0:01:43.835 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:58:00 +0000 (0:00:00.440) 0:01:44.276 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:58:01 +0000 (0:00:00.658) 0:01:44.935 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102643.4851215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102641.3631215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792387, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102641.3621216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3455655604", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:58:01 +0000 (0:00:00.377) 0:01:45.312 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-da746014-8b22-462e-bf8f-f86e22d0f24d', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:58:01 +0000 (0:00:00.453) 0:01:45.766 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:168 Wednesday 01 June 2022 16:58:02 +0000 (0:00:00.838) 0:01:46.605 
******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:58:02 +0000 (0:00:00.051) 0:01:46.656 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:58:02 +0000 (0:00:00.032) 0:01:46.688 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:58:02 +0000 (0:00:00.041) 0:01:46.730 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "size": "10G", "type": "crypt", "uuid": "a1d4d285-84aa-4b6b-9ebb-0ad9c4dbf4c3" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "da746014-8b22-462e-bf8f-f86e22d0f24d" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume 
existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:58:03 +0000 (0:00:00.388) 0:01:47.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002389", "end": "2022-06-01 12:58:02.988578", "rc": 0, "start": "2022-06-01 12:58:02.986189" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:58:03 +0000 (0:00:00.366) 0:01:47.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002742", "end": "2022-06-01 12:58:03.359368", "failed_when_result": false, "rc": 0, "start": "2022-06-01 12:58:03.356626" } STDOUT: luks-da746014-8b22-462e-bf8f-f86e22d0f24d /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:58:03 +0000 (0:00:00.372) 0:01:47.858 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.029) 0:01:47.887 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.031) 0:01:47.919 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.063) 0:01:47.983 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.034) 0:01:48.018 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.112) 0:01:48.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.035) 0:01:48.166 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "a1d4d285-84aa-4b6b-9ebb-0ad9c4dbf4c3" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "a1d4d285-84aa-4b6b-9ebb-0ad9c4dbf4c3" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.045) 0:01:48.211 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.038) 0:01:48.250 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.035) 0:01:48.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.037) 0:01:48.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.030) 0:01:48.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.030) 0:01:48.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.031) 0:01:48.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.030) 0:01:48.444 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.045) 0:01:48.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.033) 0:01:48.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.035) 0:01:48.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.029) 0:01:48.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.034) 0:01:48.621 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.037) 0:01:48.658 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:58:04 +0000 (0:00:00.036) 0:01:48.695 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102677.8221216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102677.8221216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102677.8221216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.388) 0:01:49.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.038) 0:01:49.121 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.038) 0:01:49.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.033) 0:01:49.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.030) 0:01:49.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.036) 0:01:49.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102677.9661214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102677.9661214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11287, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102677.9661214, "nlink": 1, "path": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "pw_name": 
"root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:58:05 +0000 (0:00:00.383) 0:01:49.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012876", "end": "2022-06-01 12:58:05.537405", "rc": 0, "start": "2022-06-01 12:58:05.524529" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: da746014-8b22-462e-bf8f-f86e22d0f24d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 11 Memory: 906462 Threads: 4 Salt: 6f 5e b4 85 e4 97 38 9d 5a 38 61 35 2a ab dc 9e 91 0a 8e 84 03 ba cb 19 9c ba 76 e6 e8 2b 62 5d AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 98254 Salt: 3c d0 5e 89 85 53 4f 88 4f 2b ba a9 cf 44 7a f7 e1 c0 75 a0 78 a5 b8 1f 4e a2 98 41 f1 8f 1c 2a Digest: 48 95 5f e5 fc 8b 0a 4d 7d a3 1d 33 76 e4 e1 1f 48 86 bc 15 21 25 b9 fa 4d c1 79 da e5 1c cf 08 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.393) 0:01:50.038 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.038) 0:01:50.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.036) 0:01:50.113 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.038) 0:01:50.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.037) 0:01:50.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.029) 0:01:50.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.038) 0:01:50.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:58:06 +0000 
(0:00:00.033) 0:01:50.292 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-da746014-8b22-462e-bf8f-f86e22d0f24d /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.039) 0:01:50.332 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.037) 0:01:50.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.040) 0:01:50.409 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.040) 0:01:50.450 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.041) 0:01:50.491 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.035) 0:01:50.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.035) 0:01:50.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.032) 0:01:50.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.029) 0:01:50.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.031) 0:01:50.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.033) 0:01:50.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.032) 0:01:50.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.031) 0:01:50.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.030) 0:01:50.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.030) 0:01:50.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.031) 0:01:50.847 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:58:06 +0000 (0:00:00.035) 0:01:50.882 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 
0:01:50.913 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.032) 0:01:50.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.032) 0:01:50.978 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 0:01:51.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 0:01:51.040 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.038) 0:01:51.078 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.033) 0:01:51.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 0:01:51.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 0:01:51.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.028) 0:01:51.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.031) 0:01:51.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.034) 0:01:51.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.030) 0:01:51.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.030) 0:01:51.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.032) 0:01:51.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.032) 0:01:51.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:176 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.029) 0:01:51.425 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.052) 0:01:51.478 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:58:07 +0000 (0:00:00.046) 0:01:51.524 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:58:08 +0000 (0:00:00.553) 0:01:52.077 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:58:08 +0000 (0:00:00.133) 0:01:52.211 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:58:08 +0000 (0:00:00.033) 0:01:52.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 16:58:08 +0000 (0:00:00.032) 0:01:52.276 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:58:08 +0000 (0:00:00.062) 0:01:52.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:58:08 +0000 (0:00:00.026) 0:01:52.365 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:58:09 +0000 (0:00:00.896) 0:01:53.262 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:58:09 +0000 (0:00:00.039) 0:01:53.302 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:58:09 +0000 (0:00:00.033) 0:01:53.336 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], 
"changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:58:10 +0000 (0:00:01.106) 0:01:54.443 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:58:10 +0000 (0:00:00.058) 0:01:54.501 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:58:10 +0000 (0:00:00.031) 0:01:54.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:58:10 +0000 (0:00:00.031) 0:01:54.565 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:58:10 +0000 (0:00:00.029) 0:01:54.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:58:11 +0000 (0:00:00.858) 0:01:55.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": 
{ "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": 
{ "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:58:13 +0000 (0:00:01.688)       0:01:57.141 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:58:13 +0000 (0:00:00.029)       0:01:57.190 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:58:13 +0000 (0:00:00.029)       0:01:57.219 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:

encrypted volume 'test1' missing key/password

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022  16:58:14 +0000 (0:00:01.192)       0:01:58.411 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG:

{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:58:14 +0000 (0:00:00.043)       0:01:58.455 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:197
Wednesday 01 June 2022  16:58:14 +0000 (0:00:00.028)       0:01:58.484 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the output of the keyless luks test] ******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:203
Wednesday 01 June 2022  16:58:14 +0000 (0:00:00.035)       0:01:58.519 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:210
Wednesday 01 June 2022  16:58:14 +0000 (0:00:00.053)       0:01:58.555 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:58:14 +0000 (0:00:00.053) 0:01:58.608 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:58:14 +0000 (0:00:00.044) 0:01:58.653 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.543) 0:01:59.197 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.073) 0:01:59.270 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.032) 0:01:59.302 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.032) 0:01:59.335 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.069) 0:01:59.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:58:15 +0000 (0:00:00.033) 0:01:59.437 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:58:16 +0000 (0:00:00.906) 0:02:00.343 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": 
"partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:58:16 +0000 (0:00:00.040) 0:02:00.384 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:58:16 +0000 (0:00:00.033) 0:02:00.418 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:58:17 +0000 (0:00:01.074) 0:02:01.492 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:58:17 +0000 (0:00:00.060) 0:02:01.552 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:58:17 +0000 (0:00:00.030) 0:02:01.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:58:17 +0000 (0:00:00.031) 0:02:01.614 
******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:58:17 +0000 (0:00:00.028) 0:02:01.643 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:58:18 +0000 (0:00:00.838) 0:02:02.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": 
{ "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" 
}, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": 
"nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:58:20 +0000 (0:00:01.692) 0:02:04.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:58:20 +0000 (0:00:00.046) 0:02:04.221 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 
June 2022 16:58:20 +0000 (0:00:00.028) 0:02:04.250 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:58:29 +0000 (0:00:09.601) 0:02:13.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.044) 0:02:13.896 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.037) 0:02:13.934 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.044) 0:02:13.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", 
"_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.039) 0:02:14.018 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.035) 0:02:14.053 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-da746014-8b22-462e-bf8f-f86e22d0f24d" 
} TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:58:30 +0000 (0:00:00.423) 0:02:14.476 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:58:31 +0000 (0:00:00.679) 0:02:15.156 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:58:31 +0000 (0:00:00.397) 0:02:15.554 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:58:32 +0000 (0:00:00.657) 0:02:16.212 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102683.3581214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"8aa9105cf42bfb1bb9cb52baac68ae6e4d19fd36", "ctime": 1654102681.2031214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102681.2021215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "230182720", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:58:32 +0000 (0:00:00.383) 0:02:16.595 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-da746014-8b22-462e-bf8f-f86e22d0f24d', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:58:33 +0000 (0:00:00.800) 0:02:17.395 ******** ok: 
[/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:227 Wednesday 01 June 2022 16:58:34 +0000 (0:00:00.841) 0:02:18.236 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:58:34 +0000 (0:00:00.048) 0:02:18.285 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume 
information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:58:34 +0000 (0:00:00.039) 0:02:18.325 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:58:34 +0000 (0:00:00.029) 0:02:18.355 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "size": "10G", "type": "crypt", "uuid": "8668910d-7f52-4478-9161-518a9e8620bb" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "2119d808-5078-476c-bf8b-293e4accbd6a" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:58:34 +0000 (0:00:00.420) 0:02:18.775 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002681", "end": "2022-06-01 12:58:34.653411", "rc": 0, "start": "2022-06-01 12:58:34.650730" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.385) 0:02:19.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002747", "end": "2022-06-01 12:58:35.052368", "failed_when_result": false, "rc": 0, "start": "2022-06-01 12:58:35.049621" } STDOUT: luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.390) 0:02:19.552 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.063) 0:02:19.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.029) 0:02:19.645 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.062) 0:02:19.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.033) 0:02:19.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.030) 0:02:19.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task 
path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.029) 0:02:19.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.028) 0:02:19.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:58:35 +0000 (0:00:00.031) 0:02:19.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.032) 0:02:19.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.031) 0:02:19.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.029) 0:02:19.955 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:58:36 +0000 
(0:00:00.054) 0:02:20.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.029) 0:02:20.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.073) 0:02:20.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.031) 0:02:20.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.031) 0:02:20.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.032) 0:02:20.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.030) 0:02:20.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.031) 0:02:20.270 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.032) 0:02:20.302 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.056) 0:02:20.359 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, 
u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.042) 0:02:20.401 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.062) 0:02:20.464 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] 
************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.036) 0:02:20.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.037) 0:02:20.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.033) 0:02:20.571 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.029) 0:02:20.601 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.063) 0:02:20.665 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', 
u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.041) 0:02:20.706 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.030) 0:02:20.737 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.062) 0:02:20.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:58:36 +0000 (0:00:00.034) 0:02:20.835 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.116) 0:02:20.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.035) 0:02:20.987 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "8668910d-7f52-4478-9161-518a9e8620bb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "8668910d-7f52-4478-9161-518a9e8620bb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 
2022 16:58:37 +0000 (0:00:00.042) 0:02:21.030 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.039) 0:02:21.070 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.034) 0:02:21.104 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.036) 0:02:21.141 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.032) 0:02:21.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.041) 0:02:21.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.030) 0:02:21.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.034) 0:02:21.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.052) 0:02:21.333 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.036) 0:02:21.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.035) 0:02:21.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.028) 0:02:21.434 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.032) 0:02:21.467 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.039) 0:02:21.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:58:37 +0000 (0:00:00.037) 0:02:21.544 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102709.1281216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102709.1281216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102709.1281216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": 
null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.392) 0:02:21.937 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.036) 0:02:21.973 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.034) 0:02:22.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.033) 0:02:22.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.032) 0:02:22.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.035) 0:02:22.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102709.2721214, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102709.2721214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11495, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102709.2721214, "nlink": 1, "path": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:58:38 +0000 (0:00:00.435) 0:02:22.546 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.014113", "end": "2022-06-01 12:58:38.442723", "rc": 0, "start": "2022-06-01 12:58:38.428610" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           2119d808-5078-476c-bf8b-293e4accbd6a
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  10
        Memory:     906462
        Threads:    4
        Salt:       8e 66 7c fb 34 f0 7d a2 1f 0d ca c8 03 0e b2 fd
                    7c fd 4e 9b 65 17 3c ef a6 d0 4e f1 6d b4 47 44
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 96093
        Salt:       31 07 ee d6 14 f1 03 d4 74 08 55 67 86 ab 47 1d
                    04 9a 3f 75 5b 08 d3 42 16 38 d7 9f 2c a5 26 52
        Digest:     09 0d da 46 5b 71 d1 ec 5d 6d d0 24 9e 0a 55 d8
                    30 0b 3d 98 3c de d0 f2 65 4e e3 e1 28 4a 36 42

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.402) 0:02:22.948 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.039) 0:02:22.988 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.037) 0:02:23.025 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.038) 0:02:23.063 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.036) 0:02:23.100 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.031) 0:02:23.131 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.032) 0:02:23.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.032) 0:02:23.196 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.038) 0:02:23.235 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.036) 0:02:23.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.038) 0:02:23.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.038) 0:02:23.348 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.039) 0:02:23.387 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.031) 0:02:23.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.030) 0:02:23.448 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.029) 0:02:23.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.030) 0:02:23.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.030) 0:02:23.539 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.036) 0:02:23.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.036) 0:02:23.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.032) 0:02:23.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.031) 0:02:23.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.033) 0:02:23.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.031) 0:02:23.741 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 
'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.035) 0:02:23.776 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.033) 0:02:23.810 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.032) 0:02:23.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:58:39 +0000 (0:00:00.030) 0:02:23.874 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.030) 0:02:23.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.030) 0:02:23.935 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 
16:58:40 +0000 (0:00:00.039) 0:02:23.974 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.035) 0:02:24.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.032) 0:02:24.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.032) 0:02:24.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.030) 0:02:24.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.033) 
0:02:24.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.032) 0:02:24.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.327 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.030) 0:02:24.358 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.031) 0:02:24.389 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:233 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.410) 0:02:24.800 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:58:40 +0000 (0:00:00.052) 0:02:24.852 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.047) 0:02:24.900 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.530) 0:02:25.430 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.073) 0:02:25.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.031) 0:02:25.535 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.031) 0:02:25.567 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.064) 0:02:25.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:58:41 +0000 (0:00:00.027) 0:02:25.659 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:58:42 +0000 (0:00:00.923) 0:02:26.583 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:58:42 +0000 (0:00:00.044) 0:02:26.627 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:58:42 +0000 (0:00:00.038) 0:02:26.666 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:58:43 +0000 (0:00:01.213) 0:02:27.879 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:58:44 +0000 (0:00:00.056) 0:02:27.936 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:58:44 +0000 (0:00:00.030) 0:02:27.967 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:58:44 +0000 (0:00:00.032) 0:02:27.999 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:58:44 +0000 (0:00:00.029) 0:02:28.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:58:45 +0000 (0:00:00.885) 0:02:28.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service": { "name": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:58:46 +0000 (0:00:01.731) 0:02:30.646 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:58:46 +0000 (0:00:00.047) 0:02:30.694 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dda746014\x2d8b22\x2d462e\x2dbf8f\x2df86e22d0f24d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "name": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-udevd-kernel.socket systemd-journald.socket dev-sda.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2dda746014\\\\x2d8b22\\\\x2d462e\\\\x2dbf8f\\\\x2df86e22d0f24d.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-da746014-8b22-462e-bf8f-f86e22d0f24d", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-da746014-8b22-462e-bf8f-f86e22d0f24d /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-da746014-8b22-462e-bf8f-f86e22d0f24d /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-da746014-8b22-462e-bf8f-f86e22d0f24d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-da746014-8b22-462e-bf8f-f86e22d0f24d ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", 
"LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dda746014\\\\x2d8b22\\\\x2d462e\\\\x2dbf8f\\\\x2df86e22d0f24d.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 12:58:31 EDT", "StateChangeTimestampMonotonic": "2300728396", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dda746014\\\\x2d8b22\\\\x2d462e\\\\x2dbf8f\\\\x2df86e22d0f24d.target\"", 
"WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:58:47 +0000 (0:00:00.716) 0:02:31.411 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-2119d808-5078-476c-bf8b-293e4accbd6a' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:58:48 +0000 (0:00:01.231) 0:02:32.642 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, 
u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-2119d808-5078-476c-bf8b-293e4accbd6a' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:58:48 +0000 (0:00:00.044) 0:02:32.686 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dda746014\x2d8b22\x2d462e\x2dbf8f\x2df86e22d0f24d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "name": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": 
"0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dda746014\\x2d8b22\\x2d462e\\x2dbf8f\\x2df86e22d0f24d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dda746014\\\\x2d8b22\\\\x2d462e\\\\x2dbf8f\\\\x2df86e22d0f24d.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", 
"RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:252 Wednesday 01 June 2022 16:58:49 +0000 (0:00:00.725) 0:02:33.412 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] 
********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:258 Wednesday 01 June 2022 16:58:49 +0000 (0:00:00.037) 0:02:33.449 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 16:58:49 +0000 (0:00:00.036) 0:02:33.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102720.2961216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102720.2961216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102720.2961216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1005927229", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 16:58:49 +0000 (0:00:00.392) 0:02:33.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:269 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.038) 0:02:33.917 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 
Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.051) 0:02:33.968 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.046) 0:02:34.014 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.515) 0:02:34.530 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.073) 0:02:34.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": 
false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.032) 0:02:34.636 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.038) 0:02:34.675 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.123) 0:02:34.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:58:50 +0000 (0:00:00.030) 0:02:34.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:58:51 +0000 (0:00:00.897) 0:02:35.727 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] 
*********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:58:51 +0000 (0:00:00.039) 0:02:35.766 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:58:51 +0000 (0:00:00.034) 0:02:35.800 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:58:53 +0000 (0:00:01.170) 0:02:36.971 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:58:53 +0000 (0:00:00.056) 0:02:37.027 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:58:53 +0000 (0:00:00.029) 0:02:37.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:58:53 +0000 (0:00:00.031) 0:02:37.088 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:58:53 +0000 (0:00:00.028) 0:02:37.116 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:58:54 +0000 (0:00:00.842) 0:02:37.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", 
"source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service": { "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:58:55 +0000 (0:00:01.717) 0:02:39.677 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:58:55 +0000 (0:00:00.046) 0:02:39.723 
******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d2119d808\x2d5078\x2d476c\x2dbf8b\x2d293e4accbd6a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda1.device cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", 
"CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2119d808-5078-476c-bf8b-293e4accbd6a", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2119d808-5078-476c-bf8b-293e4accbd6a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2119d808-5078-476c-bf8b-293e4accbd6a ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": 
"yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", 
"MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": 
"5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 12:58:48 EDT", "StateChangeTimestampMonotonic": "2317903648", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:58:56 +0000 (0:00:00.719) 0:02:40.443 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK 
[linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:58:58 +0000 (0:00:01.715) 0:02:42.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:58:58 +0000 (0:00:00.031) 0:02:42.190 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d2119d808\x2d5078\x2d476c\x2dbf8b\x2d293e4accbd6a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": 
"infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", 
"PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 12:58:48 EDT", "StateChangeTimestampMonotonic": "2317903648", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
"UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:58:58 +0000 (0:00:00.695) 0:02:42.885 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": 
"/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:58:59 +0000 (0:00:00.043) 0:02:42.928 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:58:59 +0000 (0:00:00.037) 0:02:42.966 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:58:59 +0000 (0:00:00.033) 0:02:42.999 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2119d808-5078-476c-bf8b-293e4accbd6a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:58:59 
+0000 (0:00:00.402) 0:02:43.402 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:59:00 +0000 (0:00:00.670) 0:02:44.072 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:59:00 +0000 (0:00:00.415) 0:02:44.487 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:59:01 +0000 (0:00:00.643) 0:02:45.131 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102715.0511215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "974b85bd1d8e7622d2a256b6f40f72c22b7a7565", "ctime": 1654102712.8841214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792387, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, 
"isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102712.8841214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3455655605", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:59:01 +0000 (0:00:00.382) 0:02:45.514 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-2119d808-5078-476c-bf8b-293e4accbd6a', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2119d808-5078-476c-bf8b-293e4accbd6a", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:59:02 +0000 (0:00:00.408) 0:02:45.922 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:286 Wednesday 01 June 2022 16:59:02 +0000 (0:00:00.831) 0:02:46.754 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:59:02 +0000 (0:00:00.057) 0:02:46.811 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:59:02 +0000 (0:00:00.041) 0:02:46.853 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 16:59:03 +0000 (0:00:00.033) 0:02:46.886 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sda1": { "fstype": "xfs", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8" },
        "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
        "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
        "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
        "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
        "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
        "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
    }
}
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 16:59:03 +0000 (0:00:00.412) 0:02:47.299 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002519", "end": "2022-06-01 12:59:03.164691", "rc": 0, "start": "2022-06-01 12:59:03.162172" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 16:59:03 +0000 (0:00:00.364) 0:02:47.663 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003155", "end": "2022-06-01 12:59:03.540724", "failed_when_result": false, "rc": 0, "start": "2022-06-01 12:59:03.537569" }
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.377) 0:02:48.041 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
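The loop-variable warning above suggests its own fix: give the nested include's loop a dedicated variable name via `loop_control`. A minimal sketch of such a task; the loop source variable and the replacement variable name are illustrative, not taken from the test suite:

```yaml
# Illustrative only, not a task from this test suite. Renaming the inner
# loop variable stops it from clobbering 'storage_test_pool', which is
# already in use by the enclosing loop.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"   # assumed name of the pool list
  loop_control:
    loop_var: storage_test_pool_inner
```

Inside `test-verify-pool.yml` the item would then be referenced as `storage_test_pool_inner` instead of the default `item` or the colliding `storage_test_pool`.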
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.063) 0:02:48.104 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.029) 0:02:48.134 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.063) 0:02:48.198 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.033) 0:02:48.232 ********
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.029) 0:02:48.262 ********
TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.027) 0:02:48.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.031) 0:02:48.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.030) 0:02:48.351 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.032) 0:02:48.384 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.033) 0:02:48.417 ********
TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.027) 0:02:48.445 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.055) 0:02:48.500 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.031) 0:02:48.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.033) 0:02:48.564 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.031) 0:02:48.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.031) 0:02:48.627 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.030) 0:02:48.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.029) 0:02:48.688 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.028) 0:02:48.716 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.093) 0:02:48.810 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 16:59:04 +0000 (0:00:00.060) 0:02:48.871 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'cache_size': 0, u'_mount_id': u'UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => {
    "ansible_loop_var": "storage_test_lvmraid_volume",
    "changed": false,
    "skip_reason": "Conditional result was False",
    "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null }
}
TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.043) 0:02:48.914 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.064) 0:02:48.979 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.036) 0:02:49.015 ********
TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.029) 0:02:49.044 ********
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.029) 0:02:49.074 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.030) 0:02:49.104 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2
TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.071) 0:02:49.175 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'cache_size': 0, u'_mount_id': u'UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => {
    "ansible_loop_var": "storage_test_vdo_volume",
    "changed": false,
    "skip_reason": "Conditional result was False",
    "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null }
}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.043) 0:02:49.218 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.031) 0:02:49.250 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.061) 0:02:49.311 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.036) 0:02:49.348 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.120) 0:02:49.469 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.037) 0:02:49.506 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [ { "block_available": 2592102, "block_size": 4096, "block_total": 2618624, "block_used": 26522, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617249792, "size_total": 10725883904, "uuid": "2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8" } ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [ { "block_available": 2592102, "block_size": 4096, "block_total": 2618624, "block_used": 26522, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617249792, "size_total": 10725883904, "uuid": "2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8" } ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.043) 0:02:49.550 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.040) 0:02:49.590 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.036) 0:02:49.627 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.038) 0:02:49.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.030) 0:02:49.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.031) 0:02:49.727 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.033) 0:02:49.760 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.036) 0:02:49.797 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [ "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8 " ],
        "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ],
        "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ]
    },
    "changed": false
}
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.049) 0:02:49.847 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 16:59:05 +0000 (0:00:00.034) 0:02:49.882 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.036) 0:02:49.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.031) 0:02:49.950 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.032) 0:02:49.982 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.040) 0:02:50.022 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.038) 0:02:50.061 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": { "atime": 1654102737.5791216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102737.5791216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102737.5791216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false }
}
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.388) 0:02:50.450 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.039) 0:02:50.489 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.044) 0:02:50.534 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.035) 0:02:50.569 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.030) 0:02:50.600 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.035) 0:02:50.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.029) 0:02:50.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.029) 0:02:50.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.028) 0:02:50.724 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.035) 0:02:50.760 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.028) 0:02:50.788 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.034) 0:02:50.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.031) 0:02:50.855 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 16:59:06 +0000 (0:00:00.031) 0:02:50.886 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.030) 0:02:50.916 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.039) 0:02:50.956 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.039) 0:02:50.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.088) 0:02:51.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.032) 0:02:51.115 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.147 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.178 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.242 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.034) 0:02:51.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.032) 0:02:51.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.340 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.032) 0:02:51.373 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.436 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.034) 0:02:51.471 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.502 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.033) 0:02:51.536 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031)
0:02:51.567 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.030) 0:02:51.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.629 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.033) 0:02:51.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.033) 0:02:51.696 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.036) 0:02:51.733 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.037) 0:02:51.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.037) 0:02:51.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.031) 0:02:51.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:59:07 +0000 (0:00:00.033) 0:02:51.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.030) 0:02:51.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.030) 0:02:51.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.030) 0:02:51.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.031) 0:02:51.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.030) 0:02:52.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.035) 0:02:52.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.031) 0:02:52.094 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.028) 0:02:52.123 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.030) 0:02:52.154 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } 
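The "Add encryption to the volume" play that follows re-invokes the linux-system-roles.storage role. Reconstructed from the `storage_pools` value the role itself prints further down in this log (task `show storage_pools`), the invocation would look roughly like the sketch below. The task structure and `include_role` layout are assumptions; only the pool and volume values are taken from the logged output:

```yaml
# Sketch of the role invocation implied by this log (layout assumed,
# values copied from the "show storage_pools" task output below).
- name: Add encryption to the volume
  include_role:
    name: linux-system-roles.storage
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo
```

With `encryption: true` on the volume, the role pulls in `cryptsetup` (visible in the `get required packages` task result) and manages the LUKS layer and the corresponding /etc/crypttab entry that the verification tasks above inspect.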
TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:292 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.375) 0:02:52.529 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.050) 0:02:52.579 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:59:08 +0000 (0:00:00.045) 0:02:52.625 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.561) 0:02:53.187 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.074) 0:02:53.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.031) 0:02:53.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.032) 0:02:53.325 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.062) 0:02:53.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:59:09 +0000 (0:00:00.028) 0:02:53.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:59:10 +0000 
(0:00:00.946) 0:02:54.363 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:59:10 +0000 (0:00:00.040) 0:02:54.404 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:59:10 +0000 (0:00:00.035) 0:02:54.439 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:59:11 +0000 (0:00:01.128) 0:02:55.568 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:59:11 +0000 (0:00:00.061) 0:02:55.630 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:59:11 +0000 (0:00:00.031) 0:02:55.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:59:11 +0000 (0:00:00.032) 0:02:55.693 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:59:11 +0000 (0:00:00.028) 0:02:55.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:59:12 +0000 (0:00:00.832) 0:02:56.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": 
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { 
"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service": { "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:59:14 +0000 (0:00:01.715) 0:02:58.270 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:59:14 +0000 (0:00:00.049) 0:02:58.320 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d2119d808\x2d5078\x2d476c\x2dbf8b\x2d293e4accbd6a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda1.device systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service 
cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2119d808-5078-476c-bf8b-293e4accbd6a", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2119d808-5078-476c-bf8b-293e4accbd6a /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2119d808-5078-476c-bf8b-293e4accbd6a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2119d808-5078-476c-bf8b-293e4accbd6a ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 12:58:48 EDT", "StateChangeTimestampMonotonic": "2317903648", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:59:15 +0000 (0:00:00.713) 0:02:59.033 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 16:59:16 +0000 (0:00:01.171) 0:03:00.204 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': 
u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:59:16 +0000 (0:00:00.045) 0:03:00.250 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d2119d808\x2d5078\x2d476c\x2dbf8b\x2d293e4accbd6a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "name": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", 
"status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": 
"/etc/systemd/system/systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit 
systemd-cryptsetup@luks\\x2d2119d808\\x2d5078\\x2d476c\\x2dbf8b\\x2d293e4accbd6a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2119d808\\\\x2d5078\\\\x2d476c\\\\x2dbf8b\\\\x2d293e4accbd6a.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:313 Wednesday 01 June 2022 16:59:17 +0000 (0:00:00.731) 0:03:00.982 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:319 Wednesday 01 June 2022 16:59:17 +0000 (0:00:00.036) 0:03:01.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 16:59:17 +0000 (0:00:00.036) 0:03:01.055 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102748.0281215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102748.0281215, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102748.0281215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2592499726", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 16:59:17 +0000 (0:00:00.389) 0:03:01.444 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:332 Wednesday 01 June 2022 16:59:17 +0000 (0:00:00.037) 0:03:01.482 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_test6vgtqyqrlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:339 Wednesday 01 June 2022 16:59:18 +0000 (0:00:00.546) 0:03:02.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_test6vgtqyqrlukskey", "gid": 0, "group": "root", 
"md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1654102758.2-74786-243306085130886/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:346 Wednesday 01 June 2022 16:59:18 +0000 (0:00:00.837) 0:03:02.867 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.051) 0:03:02.918 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.043) 0:03:02.962 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.535) 0:03:03.497 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.080) 0:03:03.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.035) 0:03:03.613 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.032) 0:03:03.646 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.130) 0:03:03.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:59:19 +0000 (0:00:00.027) 0:03:03.805 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:59:20 +0000 (0:00:00.895) 0:03:04.700 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:59:20 +0000 (0:00:00.039) 0:03:04.740 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:59:20 +0000 (0:00:00.034) 0:03:04.775 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:59:22 +0000 (0:00:01.141) 0:03:05.916 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:59:22 +0000 (0:00:00.056) 0:03:05.972 
******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:59:22 +0000 (0:00:00.030) 0:03:06.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:59:22 +0000 (0:00:00.031) 0:03:06.035 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:59:22 +0000 (0:00:00.030) 0:03:06.065 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:59:23 +0000 (0:00:00.900) 0:03:06.966 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", 
"state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" 
}, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": 
"systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 16:59:24 +0000 (0:00:01.688) 0:03:08.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:59:24 +0000 (0:00:00.049) 0:03:08.704 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:59:24 +0000 (0:00:00.028) 0:03:08.733 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "/tmp/storage_test6vgtqyqrlukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 16:59:35 +0000 (0:00:11.118) 0:03:19.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.035) 0:03:19.887 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.031) 0:03:19.919 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "/tmp/storage_test6vgtqyqrlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.042) 0:03:19.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": 
"/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.037) 0:03:19.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.038) 0:03:20.037 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=2a39446c-2ab9-4c7f-9f40-a6e4dba3d4e8" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 16:59:36 +0000 (0:00:00.408) 0:03:20.445 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 16:59:37 +0000 (0:00:00.676) 0:03:21.122 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 16:59:37 +0000 (0:00:00.419) 0:03:21.541 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 16:59:38 +0000 (0:00:00.666) 0:03:22.207 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102743.5401216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102741.4111216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, 
"islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102741.4101214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3834412320", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 16:59:38 +0000 (0:00:00.375) 0:03:22.583 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'/tmp/storage_test6vgtqyqrlukskey', u'name': u'luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "/tmp/storage_test6vgtqyqrlukskey", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 16:59:39 +0000 (0:00:00.455) 0:03:23.038 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:363 Wednesday 01 June 2022 16:59:39 +0000 (0:00:00.833) 0:03:23.871 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 16:59:40 +0000 (0:00:00.048) 0:03:23.920 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 16:59:40 +0000 (0:00:00.040) 0:03:23.961 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 16:59:40 +0000 (0:00:00.032) 0:03:23.994 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "size": "10G", "type": "crypt", "uuid": "be249fb8-e849-41ef-aeca-bb071683ce6a" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "ee0ed332-a32c-4325-95b8-fd5aa304bfbb" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": 
"", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 16:59:40 +0000 (0:00:00.374) 0:03:24.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002515", "end": "2022-06-01 12:59:40.245299", "rc": 0, "start": "2022-06-01 12:59:40.242784" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 16:59:40 +0000 (0:00:00.378) 0:03:24.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003384", "end": "2022-06-01 12:59:40.613979", "failed_when_result": false, "rc": 0, "start": "2022-06-01 12:59:40.610595" } STDOUT: luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb /dev/sda1 /tmp/storage_test6vgtqyqrlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.374) 0:03:25.122 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.067) 0:03:25.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.221 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.060) 0:03:25.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.313 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.027) 0:03:25.341 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.028) 0:03:25.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.030) 0:03:25.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.034) 0:03:25.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.497 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.028) 0:03:25.526 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.056) 0:03:25.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.032) 0:03:25.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.030) 0:03:25.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.030) 0:03:25.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.030) 0:03:25.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.031) 0:03:25.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 16:59:41 +0000 (0:00:00.039) 0:03:25.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
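The MD RAID checks above were all skipped because this pool defines no `raid_level`, so the `storage_test_md_*_re` facts end up null. When they do run, those facts hold regular expressions matched against `mdadm --detail` style output to verify active/spare device counts and metadata version. A hypothetical sketch of that kind of check (the sample text here is illustrative, not taken from this run):

```python
import re

# Illustrative fragment of mdadm --detail output; not captured from this log.
sample = """
     Raid Level : raid1
 Active Devices : 2
  Spare Devices : 1
"""

# The kind of pattern a storage_test_md_active_devices_re fact would carry.
active_re = re.compile(r"Active Devices\s*:\s*(\d+)")
spare_re = re.compile(r"Spare Devices\s*:\s*(\d+)")

active_count = int(active_re.search(sample).group(1))
spare_count = int(spare_re.search(sample).group(1))
```

For a non-RAID partition pool like `foo` above, none of this applies, so the conditionals evaluate false and the tasks report `skip_reason: Conditional result was False`.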
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.080) 0:03:25.891 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.059) 0:03:25.951 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'/tmp/storage_test6vgtqyqrlukskey', u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": 
"/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.044) 0:03:25.995 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.064) 0:03:26.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.036) 0:03:26.097 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.029) 0:03:26.126 ******** 
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.029) 0:03:26.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.032) 0:03:26.187 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.067) 0:03:26.255 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'/tmp/storage_test6vgtqyqrlukskey', u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], 
u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test6vgtqyqrlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.041) 0:03:26.297 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.032) 0:03:26.329 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.062) 0:03:26.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.042) 0:03:26.434 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.119) 0:03:26.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.036) 0:03:26.591 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "be249fb8-e849-41ef-aeca-bb071683ce6a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "be249fb8-e849-41ef-aeca-bb071683ce6a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.042) 0:03:26.633 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.039) 0:03:26.673 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.035) 0:03:26.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.040) 0:03:26.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.035) 0:03:26.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.031) 0:03:26.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.030) 0:03:26.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 16:59:42 +0000 (0:00:00.031) 0:03:26.878 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.047) 0:03:26.926 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.033) 0:03:26.959 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.035) 0:03:26.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.031) 0:03:27.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.034) 0:03:27.061 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.037) 0:03:27.099 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.037) 0:03:27.137 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102775.0991216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102775.0991216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102775.0991216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.388) 0:03:27.526 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.038) 0:03:27.564 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.034) 0:03:27.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.031) 0:03:27.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.030) 0:03:27.660 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 16:59:43 +0000 (0:00:00.033) 0:03:27.693 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102775.2601216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102775.2601216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 11929, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102775.2601216, "nlink": 1, "path": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.384) 0:03:28.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.013246", "end": "2022-06-01 12:59:43.970766", "rc": 0, "start": "2022-06-01 12:59:43.957520" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: ee0ed332-a32c-4325-95b8-fd5aa304bfbb Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 10 Memory: 906462 Threads: 4 Salt: 85 0f a1 32 8d 63 5e a2 77 cf 19 f4 57 69 74 ad dc eb 2e 09 9a 89 e7 f3 ae 67 de 11 aa 4b 02 eb AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 97090 Salt: bb 39 ee 25 ce 5f b6 f9 1a c5 6f 0f 02 78 f5 66 97 38 2f 76 d0 89 0c d1 64 94 ed b6 90 d7 a5 c7 Digest: 0a 9f 55 85 84 05 09 8d f7 15 99 ee 8f 0f 4d 81 0e 8e 70 c0 7c f2 da a8 ae 22 44 3d 21 97 fb 4e TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.440) 0:03:28.518 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.039) 0:03:28.557 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.040) 0:03:28.598 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.036) 0:03:28.634 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.035) 0:03:28.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.028) 0:03:28.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.032) 0:03:28.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 16:59:44 +0000 
(0:00:00.028) 0:03:28.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb /dev/sda1 /tmp/storage_test6vgtqyqrlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_test6vgtqyqrlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.036) 0:03:28.797 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.033) 0:03:28.830 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 16:59:44 +0000 (0:00:00.035) 0:03:28.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.037) 0:03:28.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.042) 0:03:28.945 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": 
false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:28.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.032) 0:03:29.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.035) 0:03:29.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.042) 0:03:29.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] 
********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.034) 0:03:29.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.034) 0:03:29.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.034) 0:03:29.259 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.326 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.038) 0:03:29.365 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.398 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.031) 0:03:29.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.032) 0:03:29.462 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.033) 0:03:29.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.032) 0:03:29.528 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.039) 0:03:29.567 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.035) 0:03:29.602 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.032) 0:03:29.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.031) 0:03:29.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.032) 0:03:29.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.034) 0:03:29.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.036) 0:03:29.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.035) 0:03:29.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.034) 0:03:29.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 16:59:45 +0000 (0:00:00.036) 0:03:29.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.032) 0:03:29.910 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.032) 0:03:29.942 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.031) 0:03:29.974 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:365 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.033) 0:03:30.007 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "path": "/tmp/storage_test6vgtqyqrlukskey", "state": "absent" } TASK [Create an encrypted lvm volume w/ 
default fs] **************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:377 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.382) 0:03:30.390 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.050) 0:03:30.441 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 16:59:46 +0000 (0:00:00.047) 0:03:30.488 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.558) 0:03:31.047 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.073) 0:03:31.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.031) 0:03:31.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.031) 0:03:31.184 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.064) 0:03:31.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 16:59:47 +0000 (0:00:00.027) 0:03:31.276 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 16:59:48 +0000 (0:00:00.932) 0:03:32.209 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 16:59:48 +0000 (0:00:00.103) 0:03:32.313 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 16:59:48 +0000 (0:00:00.035) 0:03:32.348 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 16:59:49 +0000 (0:00:01.181) 0:03:33.530 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 16:59:49 +0000 (0:00:00.056) 0:03:33.586 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 16:59:49 +0000 (0:00:00.028) 0:03:33.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] 
******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 16:59:49 +0000 (0:00:00.030) 0:03:33.646 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 16:59:49 +0000 (0:00:00.028) 0:03:33.674 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 16:59:50 +0000 (0:00:00.884) 0:03:34.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": 
"systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": 
"nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": 
"systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  16:59:52 +0000 (0:00:01.665)       0:03:36.225 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  16:59:52 +0000
(0:00:00.049)       0:03:36.274 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  16:59:52 +0000 (0:00:00.029)       0:03:36.304 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

encrypted volume 'test1' missing key/password

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022  16:59:53 +0000 (0:00:01.232)       0:03:37.536 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "changed": false
}

MSG:

{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  16:59:53 +0000 (0:00:00.030)       0:03:37.580 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:397
Wednesday 01
June 2022  16:59:53 +0000 (0:00:00.030)       0:03:37.610 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the output of the keyless luks test] ******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:403
Wednesday 01 June 2022  16:59:53 +0000 (0:00:00.037)       0:03:37.648 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Create an encrypted lvm volume w/ default fs] ****************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:410
Wednesday 01 June 2022  16:59:53 +0000 (0:00:00.054)       0:03:37.685 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  16:59:53 +0000 (0:00:00.054)       0:03:37.739 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  16:59:53 +0000 (0:00:00.046)       0:03:37.786 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.534)       0:03:38.320 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.073)       0:03:38.393 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.032)       0:03:38.426 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.032)       0:03:38.458 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.064)       0:03:38.522 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  16:59:54 +0000 (0:00:00.026)       0:03:38.549 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  16:59:55 +0000 (0:00:00.901)       0:03:39.450 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "lvm",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_cipher": "serpent-xts-plain64",
                    "encryption_key_size": 512,
                    "encryption_luks_version": "luks1",
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  16:59:55 +0000 (0:00:00.040)       0:03:39.490 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  16:59:55 +0000 (0:00:00.035)       0:03:39.526 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup",
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  16:59:56 +0000 (0:00:01.158)       0:03:40.684 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK
[linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  16:59:56 +0000 (0:00:00.056)       0:03:40.741 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  16:59:56 +0000 (0:00:00.029)       0:03:40.771 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  16:59:56 +0000 (0:00:00.032)       0:03:40.803 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  16:59:56 +0000 (0:00:00.029)       0:03:40.833 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  16:59:57 +0000 (0:00:00.820)       0:03:41.654 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd",
"state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": 
"grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": 
"raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": 
"unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": 
{ "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": 
"systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": 
"running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 16:59:59 +0000 (0:00:01.661) 0:03:43.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 16:59:59 +0000 (0:00:00.054) 0:03:43.370 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 16:59:59 +0000 (0:00:00.036) 0:03:43.406 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:00:07 +0000 (0:00:07.984) 0:03:51.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:00:07 +0000 (0:00:00.032) 0:03:51.424 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:00:07 +0000 (0:00:00.028) 0:03:51.453 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", 
"encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:00:07 +0000 (0:00:00.055) 0:03:51.508 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:00:07 +0000 (0:00:00.042) 0:03:51.551 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:00:07 +0000 (0:00:00.039) 0:03:51.590 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:00:08 +0000 (0:00:00.407) 0:03:51.997 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:00:08 +0000 (0:00:00.661) 0:03:52.658 ******** 
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:00:09 +0000 (0:00:00.433) 0:03:53.092 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:00:09 +0000 (0:00:00.679) 0:03:53.772 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102780.6131215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c58a0dd49f37b8c37264e490c1c1c2fe4301308b", "ctime": 1654102778.4671216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417712, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102778.4661214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "1416674737", "wgrp": false, 
"woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:00:10 +0000 (0:00:00.397) 0:03:54.169 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-86e9dd83-d809-42e7-9e1a-14301954a2aa', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:00:11 +0000 (0:00:00.720) 0:03:54.889 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:429 Wednesday 01 June 2022 17:00:11 +0000 (0:00:00.927) 0:03:55.817 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:00:11 +0000 (0:00:00.054) 0:03:55.871 ******** ok: [/cache/rhel-x.qcow2] => { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:00:12 +0000 (0:00:00.044) 0:03:55.916 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:00:12 +0000 (0:00:00.033) 0:03:55.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "86e9dd83-d809-42e7-9e1a-14301954a2aa" }, "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "size": "4G", "type": "crypt", "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:00:12 +0000 (0:00:00.375) 0:03:56.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002692", "end": "2022-06-01 13:00:12.192709", "rc": 0, "start": "2022-06-01 13:00:12.190017" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:00:12 +0000 (0:00:00.376) 0:03:56.701 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.004032", "end": "2022-06-01 13:00:12.592253", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:00:12.588221" } STDOUT: luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.404) 0:03:57.106 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.069) 0:03:57.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.031) 0:03:57.207 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.061) 0:03:57.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.040) 0:03:57.308 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:00:13 +0000 (0:00:00.568) 0:03:57.877 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.045) 0:03:57.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.041) 0:03:57.964 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.041) 0:03:58.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.038) 0:03:58.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.076 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.043) 0:03:58.120 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.060) 0:03:58.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.031) 0:03:58.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.031) 0:03:58.308 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.031) 0:03:58.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:00:14 +0000 (0:00:00.034) 0:03:58.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.440 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.060) 0:03:58.500 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.117) 0:03:58.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:00:14 +0000 (0:00:00.031) 0:03:58.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.032) 0:03:58.715 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.065) 0:03:58.781 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.037) 0:03:58.818 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:00:14 +0000 (0:00:00.036) 0:03:58.855 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.059) 0:03:58.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.052) 0:03:58.967 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.053) 0:03:59.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.032) 0:03:59.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.032) 0:03:59.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.033) 0:03:59.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.034) 0:03:59.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.037) 0:03:59.191 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.065) 0:03:59.257 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.069) 0:03:59.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.033) 0:03:59.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.032) 0:03:59.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.031) 0:03:59.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.031) 0:03:59.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.031) 0:03:59.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.034) 0:03:59.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.032) 0:03:59.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.031) 0:03:59.584 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.030) 0:03:59.615 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.062) 0:03:59.677 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.036) 0:03:59.714 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.122) 0:03:59.837 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:00:15 +0000 (0:00:00.035) 0:03:59.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.044) 0:03:59.918 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.038) 0:03:59.956 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.038) 0:03:59.995 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.037) 0:04:00.033 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.031) 0:04:00.065 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.030) 0:04:00.095 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.030) 0:04:00.126 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.031) 0:04:00.158 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.050) 0:04:00.208 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.037) 0:04:00.246 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.039) 0:04:00.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.030) 0:04:00.316 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.032) 0:04:00.348 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.039) 0:04:00.387 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:00:16 +0000 (0:00:00.048) 0:04:00.435 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102806.6311214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102806.6311214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12177, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102806.6311214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.485) 0:04:00.921 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.041) 0:04:00.962 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.038) 0:04:01.001 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.035) 0:04:01.036 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.034) 0:04:01.070 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.037) 0:04:01.108 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102806.7971215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102806.7971215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12216, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102806.7971215, "nlink": 1, "path": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:00:17 +0000 (0:00:00.394) 0:04:01.502 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.011965", "end": "2022-06-01 13:00:17.404550", "rc": 0, "start": "2022-06-01 13:00:17.392585" }
STDOUT:
LUKS header information for /dev/mapper/foo-test1
Version:        1
Cipher name:    serpent
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 4096
MK bits:        512
MK digest:      61 5b a5 3c 04 f8 e5 dc 90 c3 d8 a7 c2 6a 07 b5 ae 0f fb 6f
MK salt:        1b 73 5c 92 e5 23 4e 6d e1 27 4b 9f d7 3f ce b9 87 56 2b 9f d8 ff ac fe 18 63 06 ae 0b 86 1f 69
MK iterations:  96376
UUID:           86e9dd83-d809-42e7-9e1a-14301954a2aa
Key Slot 0: ENABLED
        Iterations:             1504412
        Salt:                   ed 72 67 56 49 a8 db 50 28 2a 88 f6 42 ea ea d9 69 c7 60 fb 44 b0 80 4f d9 0c 3a 9e 7b 18 98 e1
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.411) 0:04:01.914 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.041) 0:04:01.955 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.039) 0:04:01.994 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.040) 0:04:02.034 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.037) 0:04:02.072 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.038) 0:04:02.111 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.038) 0:04:02.150 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.040) 0:04:02.190 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.040) 0:04:02.231 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.036) 0:04:02.267 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.040) 0:04:02.307 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.042) 0:04:02.350 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.039) 0:04:02.389 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.037) 0:04:02.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.033) 0:04:02.460 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.032) 0:04:02.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.036) 0:04:02.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.033) 0:04:02.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.031) 0:04:02.594 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.033) 0:04:02.628 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:00:18 +0000 (0:00:00.030) 0:04:02.659 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.533) 0:04:03.192 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.373) 0:04:03.566 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.039) 0:04:03.606 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.034) 0:04:03.641 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.030) 0:04:03.672 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.032) 0:04:03.704 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.032) 0:04:03.737 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.031) 0:04:03.768 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.032) 0:04:03.801 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.035) 0:04:03.836 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:00:19 +0000 (0:00:00.034) 0:04:03.871 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.042) 0:04:03.914 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.042600", "end": "2022-06-01 13:00:19.834915", "rc": 0, "start": "2022-06-01 13:00:19.792315" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.427) 0:04:04.341 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.040) 0:04:04.382 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.041) 0:04:04.424 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.035) 0:04:04.459 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.034) 0:04:04.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.035) 0:04:04.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.039) 0:04:04.568 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.040) 0:04:04.609 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.041) 0:04:04.651 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.031) 0:04:04.682 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:431
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.034) 0:04:04.716 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.064) 0:04:04.781 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:00:20 +0000 (0:00:00.046) 0:04:04.827 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.523) 0:04:05.351 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.149) 0:04:05.501 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.034) 0:04:05.535 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.034) 0:04:05.569 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.069) 0:04:05.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:00:21 +0000 (0:00:00.028) 0:04:05.667 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:00:22 +0000 (0:00:00.902) 0:04:06.570 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:00:22 +0000 (0:00:00.040) 0:04:06.611 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:00:22 +0000 (0:00:00.036) 0:04:06.647 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:00:24 +0000 (0:00:01.269) 0:04:07.917 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:00:24 +0000 (0:00:00.057) 0:04:07.974 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:00:24 +0000 (0:00:00.027) 0:04:08.002 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:00:24 +0000 (0:00:00.029) 0:04:08.031 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:00:24 +0000 (0:00:00.029) 0:04:08.061 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:00:25 +0000 (0:00:00.832) 0:04:08.894 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state":
"stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": 
"oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { 
"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": 
"sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service": { "name": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:00:26 +0000 (0:00:01.703) 0:04:10.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:00:26 +0000 (0:00:00.049) 0:04:10.647 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2dee0ed332\x2da32c\x2d4325\x2d95b8\x2dfd5aa304bfbb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "name": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket systemd-journald.socket tmp.mount cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda1.device -.mount", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2dee0ed332\\\\x2da32c\\\\x2d4325\\\\x2d95b8\\\\x2dfd5aa304bfbb.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", 
"CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb /dev/sda1 /tmp/storage_test6vgtqyqrlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb /dev/sda1 /tmp/storage_test6vgtqyqrlukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ee0ed332-a32c-4325-95b8-fd5aa304bfbb ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "FreezerState": "running", 
"GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", 
"MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dee0ed332\\\\x2da32c\\\\x2d4325\\\\x2d95b8\\\\x2dfd5aa304bfbb.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_test6vgtqyqrlukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": 
"5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:09 EDT", "StateChangeTimestampMonotonic": "2398282138", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dee0ed332\\\\x2da32c\\\\x2d4325\\\\x2d95b8\\\\x2dfd5aa304bfbb.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:00:27 +0000 (0:00:00.711) 0:04:11.359 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" } ], "packages": [ 
"lvm2", "xfsprogs", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:00:28 +0000 (0:00:01.333) 0:04:12.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 
Wednesday 01 June 2022 17:00:28 +0000 (0:00:00.033) 0:04:12.726 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dee0ed332\x2da32c\x2d4325\x2d95b8\x2dfd5aa304bfbb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "name": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", 
"LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dee0ed332\\x2da32c\\x2d4325\\x2d95b8\\x2dfd5aa304bfbb.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dee0ed332\\\\x2da32c\\\\x2d4325\\\\x2d95b8\\\\x2dfd5aa304bfbb.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", 
"RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:00:29 +0000 (0:00:00.674) 0:04:13.401 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", 
"/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
17:00:29 +0000 (0:00:00.041) 0:04:13.442 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:00:29 +0000 (0:00:00.040) 0:04:13.483 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:00:29 +0000 (0:00:00.035) 0:04:13.518 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:00:29 +0000 (0:00:00.031) 0:04:13.549 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:00:30 +0000 (0:00:00.648) 0:04:14.197 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:00:30 +0000 (0:00:00.399) 0:04:14.597 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:00:31 +0000 (0:00:00.670) 0:04:15.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"stat": { "atime": 1654102812.5911214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9c0f5107d9926c6bd40aa9fc6ddd0d5b722b017d", "ctime": 1654102810.3791215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102810.3781216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2747625667", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:00:31 +0000 (0:00:00.393) 0:04:15.662 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:00:31 +0000 (0:00:00.031) 0:04:15.693 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:445 Wednesday 01 June 2022 17:00:32 +0000 (0:00:00.860) 0:04:16.554 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:451 Wednesday 01 June 2022 17:00:32 +0000 (0:00:00.039) 0:04:16.594 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:00:32 +0000 (0:00:00.065) 0:04:16.659 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:00:32 +0000 (0:00:00.040) 0:04:16.700 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:00:32 +0000 (0:00:00.032) 0:04:16.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "86e9dd83-d809-42e7-9e1a-14301954a2aa" }, "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "size": "4G", "type": "crypt", "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:00:33 +0000 (0:00:00.403) 0:04:17.136 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002669", "end": "2022-06-01 13:00:33.012992", "rc": 0, "start": "2022-06-01 13:00:33.010323" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:00:33 +0000 (0:00:00.376) 0:04:17.518 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002420", "end": "2022-06-01 13:00:33.387841", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:00:33.385421" }

STDOUT:

luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.376) 0:04:17.895 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
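For context, the fstab and crypttab state verified above corresponds to a `storage_pools` invocation of the storage role roughly like the following. This is a reconstruction from the logged pool/volume facts for illustration only, not the actual test input; the play layout and the `vault_luks_password` variable are placeholders:

```yaml
# Hypothetical sketch of the linux-system-roles.storage input that would
# yield the verified state: LVM pool "foo" on /dev/sda with one 4g XFS
# volume "test1", LUKS1-encrypted and mounted at /opt/test1.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks1
                encryption_password: "{{ vault_luks_password }}"  # placeholder
```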
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.070) 0:04:17.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.032) 0:04:17.997 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.064) 0:04:18.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.042) 0:04:18.104 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.389) 0:04:18.494 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.045) 0:04:18.539 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.050) 0:04:18.589 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.042) 0:04:18.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.042) 0:04:18.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.032) 0:04:18.707 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.048) 0:04:18.755 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.057) 0:04:18.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.033) 0:04:18.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:00:34 +0000 (0:00:00.031) 0:04:18.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.036) 0:04:18.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.097) 0:04:19.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.034) 0:04:19.046 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:00:35 +0000 (0:00:00.033) 0:04:19.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.032) 0:04:19.112 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.036) 0:04:19.148 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.064) 0:04:19.213 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.064) 0:04:19.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.031) 0:04:19.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:00:35 +0000 (0:00:00.033) 0:04:19.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.031) 0:04:19.375 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.064) 0:04:19.440 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.038) 0:04:19.478 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.036) 0:04:19.514 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.061) 0:04:19.576 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.036) 0:04:19.613 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.036) 0:04:19.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.031) 0:04:19.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.033) 0:04:19.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.031) 0:04:19.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.032) 0:04:19.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.032) 0:04:19.811 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:00:35 +0000 (0:00:00.066) 0:04:19.878 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.074) 0:04:19.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.036) 0:04:19.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.033) 0:04:20.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.036) 0:04:20.058 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.034) 0:04:20.093 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.032) 0:04:20.126 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.033) 0:04:20.160 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.032) 0:04:20.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.033) 0:04:20.225 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.036) 0:04:20.261 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.065) 0:04:20.327 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.037) 0:04:20.364 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.124) 0:04:20.489 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.046) 0:04:20.535 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "6cf8ae09-742d-46f3-aab8-fec1f8e96de7" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.044) 0:04:20.580 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.039) 0:04:20.619 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type]
************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.037) 0:04:20.656 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.040) 0:04:20.697 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.032) 0:04:20.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.032) 0:04:20.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.030) 0:04:20.793 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.032) 0:04:20.825 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:00:36 +0000 (0:00:00.048) 0:04:20.874 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.035) 0:04:20.910 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.039) 0:04:20.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.097) 0:04:21.047 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.033) 0:04:21.081 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.037) 0:04:21.118 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.037) 0:04:21.156 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102817.4001215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102806.6311214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12177, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102806.6311214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.411) 0:04:21.567 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.039) 0:04:21.607 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.036) 0:04:21.643 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.036) 0:04:21.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.032) 0:04:21.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:00:37 +0000 (0:00:00.041) 0:04:21.755 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102828.1311214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102806.7971215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12216, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102806.7971215, "nlink": 1, "path": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "pw_name":
"root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.393) 0:04:22.149 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.011141", "end": "2022-06-01 13:00:38.032534", "rc": 0, "start": "2022-06-01 13:00:38.021393" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 61 5b a5 3c 04 f8 e5 dc 90 c3 d8 a7 c2 6a 07 b5 ae 0f fb 6f MK salt: 1b 73 5c 92 e5 23 4e 6d e1 27 4b 9f d7 3f ce b9 87 56 2b 9f d8 ff ac fe 18 63 06 ae 0b 86 1f 69 MK iterations: 96376 UUID: 86e9dd83-d809-42e7-9e1a-14301954a2aa Key Slot 0: ENABLED Iterations: 1504412 Salt: ed 72 67 56 49 a8 db 50 28 2a 88 f6 42 ea ea d9 69 c7 60 fb 44 b0 80 4f d9 0c 3a 9e 7b 18 98 e1 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.390) 0:04:22.539 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.042) 0:04:22.582 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.041) 0:04:22.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.041) 0:04:22.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.041) 0:04:22.706 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.040) 0:04:22.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.034) 0:04:22.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.033) 0:04:22.815 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:00:38 +0000 (0:00:00.040) 0:04:22.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.037) 0:04:22.893 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.042) 0:04:22.936 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.039) 0:04:22.976 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.040) 0:04:23.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.032) 0:04:23.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.032) 0:04:23.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.031) 0:04:23.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.034) 0:04:23.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.032) 0:04:23.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.033) 0:04:23.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.032) 0:04:23.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
*************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.032) 0:04:23.279 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:00:39 +0000 (0:00:00.367) 0:04:23.646 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.386) 0:04:24.033 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.039) 0:04:24.072 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.036) 0:04:24.108 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.032) 0:04:24.140 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.031) 0:04:24.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.032) 0:04:24.204 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.034) 0:04:24.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.032) 0:04:24.272 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.037) 0:04:24.309 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.033) 0:04:24.343 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.040) 0:04:24.383 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.044190", "end": "2022-06-01 13:00:40.300757", "rc": 0, "start": "2022-06-01 13:00:40.256567" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.424) 0:04:24.808 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:00:40 +0000 (0:00:00.042) 0:04:24.851 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.045) 0:04:24.896 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.033) 0:04:24.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.036) 0:04:24.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.033) 0:04:25.000 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.033) 0:04:25.033 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.031) 0:04:25.065 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.030) 0:04:25.096 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.028) 0:04:25.124 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.033) 0:04:25.158 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Remove the encryption layer] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:457
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.381) 0:04:25.540 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.058) 0:04:25.598 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:00:41 +0000 (0:00:00.051) 0:04:25.650 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.548) 0:04:26.198 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.076) 0:04:26.274 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.032) 0:04:26.306 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.031) 0:04:26.338 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.066) 0:04:26.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:00:42 +0000 (0:00:00.032) 0:04:26.437 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:00:43 +0000 (0:00:00.900) 0:04:27.338 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:00:43 +0000 (0:00:00.039) 0:04:27.377 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:00:43 +0000 (0:00:00.036) 0:04:27.413 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:00:44 +0000 (0:00:01.276) 0:04:28.690 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:00:44 +0000 (0:00:00.058) 0:04:28.748 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:00:44 +0000 (0:00:00.030) 0:04:28.779 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:00:44 +0000 (0:00:00.031) 0:04:28.810 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:00:44 +0000 (0:00:00.028) 0:04:28.839 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:00:45 +0000 (0:00:00.868) 0:04:29.707 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": {
"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { 
"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service": { "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:00:47 +0000 (0:00:01.685) 0:04:31.393 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:00:47 +0000 (0:00:00.049) 0:04:31.442 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\" cryptsetup.target umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable 
cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:28 EDT", "StateChangeTimestampMonotonic": "2417889349", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:00:48 +0000 
(0:00:00.693) 0:04:32.136 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-86e9dd83-d809-42e7-9e1a-14301954a2aa' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:00:49 +0000 (0:00:01.321) 0:04:33.457 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 
0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-86e9dd83-d809-42e7-9e1a-14301954a2aa' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:00:49 +0000 (0:00:00.043) 0:04:33.501 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", 
"RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:28 EDT", "StateChangeTimestampMonotonic": "2417889349", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:477 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.712) 0:04:34.214 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:483 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.038) 0:04:34.253 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.037) 0:04:34.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102841.0311215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102841.0311215, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102841.0311215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1104072144", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.407) 0:04:34.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:494 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.038) 0:04:34.735 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.052) 0:04:34.787 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:00:50 +0000 (0:00:00.046) 0:04:34.834 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.552) 0:04:35.387 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.078) 0:04:35.465 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.033) 0:04:35.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.034) 0:04:35.534 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.068) 0:04:35.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:00:51 +0000 (0:00:00.028) 0:04:35.631 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:00:52 +0000 (0:00:00.946) 0:04:36.577 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:00:52 +0000 
(0:00:00.038) 0:04:36.616 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:00:52 +0000 (0:00:00.038) 0:04:36.654 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:00:54 +0000 (0:00:01.282) 0:04:37.937 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:00:54 +0000 (0:00:00.062) 0:04:37.999 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:00:54 +0000 (0:00:00.031) 0:04:38.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:00:54 +0000 (0:00:00.032) 0:04:38.063 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:00:54 +0000 (0:00:00.031) 
0:04:38.094 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:00:55 +0000 (0:00:00.824) 0:04:38.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": 
"dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": 
"insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service": { "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": 
"systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:00:56 +0000 (0:00:01.739) 0:04:40.659 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:00:56 +0000 (0:00:00.053) 0:04:40.713 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\" umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", 
"CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not 
set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", 
"MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:28 EDT", "StateChangeTimestampMonotonic": "2417889349", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:00:57 +0000 (0:00:00.700) 0:04:41.413 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": 
"xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:00:59 +0000 (0:00:02.021) 0:04:43.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:00:59 +0000 (0:00:00.033) 0:04:43.469 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace 
cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": 
"15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", 
"Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:28 EDT", "StateChangeTimestampMonotonic": "2417889349", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", 
"TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:01:00 +0000 (0:00:00.677) 0:04:44.146 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:01:00 +0000 (0:00:00.043) 0:04:44.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:01:00 +0000 (0:00:00.040) 0:04:44.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:01:00 +0000 (0:00:00.036) 0:04:44.267 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-86e9dd83-d809-42e7-9e1a-14301954a2aa" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:01:00 +0000 (0:00:00.400) 0:04:44.668 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:01:01 +0000 (0:00:00.651) 0:04:45.319 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:01:01 +0000 (0:00:00.431) 0:04:45.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:01:02 +0000 (0:00:00.663) 0:04:46.414 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102812.5911214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9c0f5107d9926c6bd40aa9fc6ddd0d5b722b017d", "ctime": 1654102810.3791215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730271, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102810.3781216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2747625667", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:01:02 +0000 (0:00:00.406) 0:04:46.821 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-86e9dd83-d809-42e7-9e1a-14301954a2aa', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:01:03 +0000 (0:00:00.415) 0:04:47.237 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:510 Wednesday 01 June 2022 17:01:04 +0000 (0:00:00.851) 0:04:48.088 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:01:04 +0000 (0:00:00.051) 0:04:48.139 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:01:04 +0000 (0:00:00.042) 0:04:48.182 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:01:04 +0000 (0:00:00.032) 0:04:48.215 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7d88bc0c-ab85-4c15-81c3-b33b6c6aa3ed" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:01:04 +0000 (0:00:00.384) 0:04:48.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002926", "end": "2022-06-01 13:01:04.476345", "rc": 0, "start": "2022-06-01 13:01:04.473419" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.384) 0:04:48.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002349", "end": "2022-06-01 13:01:04.855332", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:01:04.852983" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.377) 0:04:49.361 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.069) 0:04:49.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.033) 0:04:49.464 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.065) 0:04:49.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:01:05 +0000 (0:00:00.041) 0:04:49.571 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.378) 0:04:49.949 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.044) 0:04:49.993 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.040) 0:04:50.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.037) 0:04:50.071 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.037) 0:04:50.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.140 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.044) 0:04:50.185 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.056) 0:04:50.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.030) 0:04:50.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.033) 0:04:50.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:01:06 +0000 (0:00:00.032) 0:04:50.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.032) 0:04:50.502 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.109) 0:04:50.612 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.064) 0:04:50.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.033) 0:04:50.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:01:06 +0000 (0:00:00.034) 0:04:50.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.031) 0:04:50.776 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.063) 0:04:50.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:01:06 +0000 (0:00:00.037) 0:04:50.876 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.038) 0:04:50.915 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.059) 0:04:50.974 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.037) 0:04:51.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.037) 0:04:51.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.031) 0:04:51.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.039) 0:04:51.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.033) 0:04:51.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.031) 0:04:51.186 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.034) 0:04:51.221 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.071) 0:04:51.292 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.069) 0:04:51.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.032) 0:04:51.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.032) 0:04:51.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.033) 0:04:51.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.031) 0:04:51.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.031) 0:04:51.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.032) 0:04:51.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.033) 0:04:51.589 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.033) 0:04:51.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.035) 0:04:51.659 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.063) 0:04:51.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:01:07 +0000 (0:00:00.044) 0:04:51.767 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.129) 0:04:51.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.041) 0:04:51.937 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "7d88bc0c-ab85-4c15-81c3-b33b6c6aa3ed" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "7d88bc0c-ab85-4c15-81c3-b33b6c6aa3ed" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.046) 0:04:51.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.038) 0:04:52.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:01:08 +0000 
(0:00:00.036) 0:04:52.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.036) 0:04:52.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.030) 0:04:52.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.031) 0:04:52.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.030) 0:04:52.188 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.032) 0:04:52.220 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.046) 0:04:52.267 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.035) 0:04:52.303 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.035) 0:04:52.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.032) 0:04:52.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.031) 0:04:52.403 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.037) 0:04:52.441 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.036) 0:04:52.478 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102858.8661215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102858.8661215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12444, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102858.8661215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:01:08 +0000 (0:00:00.378) 0:04:52.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.041) 0:04:52.897 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.037) 0:04:52.934 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.033) 0:04:52.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.001 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.036) 0:04:53.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.030) 0:04:53.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.030) 0:04:53.130 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.037) 0:04:53.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.034) 0:04:53.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.330 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.038) 0:04:53.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.038) 0:04:53.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.030) 0:04:53.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.034) 0:04:53.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.032) 0:04:53.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.030) 0:04:53.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.030) 0:04:53.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:01:09 +0000 (0:00:00.031) 0:04:53.758 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.385) 0:04:54.143 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.393) 0:04:54.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.040) 0:04:54.577 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.035) 0:04:54.613 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.032) 0:04:54.646 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.032) 0:04:54.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.032) 0:04:54.712 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.032) 0:04:54.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.032) 0:04:54.777 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.038) 0:04:54.816 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:01:10 +0000 (0:00:00.035) 0:04:54.851 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.040) 0:04:54.892 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035404", "end": "2022-06-01 13:01:10.805558", "rc": 0, "start": "2022-06-01 13:01:10.770154" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.419) 0:04:55.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.042) 0:04:55.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.043) 0:04:55.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.036) 0:04:55.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.035) 0:04:55.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.033) 0:04:55.502 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.035) 0:04:55.538 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.034) 0:04:55.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.032) 0:04:55.605 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.030) 0:04:55.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:01:11 +0000 (0:00:00.034) 0:04:55.670 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:516 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.391) 0:04:56.061 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.057) 0:04:56.119 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.046) 0:04:56.166 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.545) 0:04:56.712 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", 
"kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.088) 0:04:56.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.034) 0:04:56.835 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:01:12 +0000 (0:00:00.033) 0:04:56.869 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:01:13 +0000 (0:00:00.130) 0:04:56.999 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 
17:01:13 +0000 (0:00:00.028) 0:04:57.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:01:14 +0000 (0:00:00.885) 0:04:57.914 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:01:14 +0000 (0:00:00.042) 0:04:57.956 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:01:14 +0000 (0:00:00.036) 0:04:57.992 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:01:15 +0000 (0:00:01.259) 0:04:59.252 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:01:15 +0000 
(0:00:00.054) 0:04:59.306 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:01:15 +0000 (0:00:00.030) 0:04:59.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:01:15 +0000 (0:00:00.031) 0:04:59.368 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:01:15 +0000 (0:00:00.029) 0:04:59.398 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:01:16 +0000 (0:00:00.878) 0:05:00.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": 
"rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service": { "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { 
"name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:01:18 +0000 (0:00:01.743) 0:05:02.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:01:18 +0000 (0:00:00.050) 0:05:02.071 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\" cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": 
"no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-86e9dd83-d809-42e7-9e1a-14301954a2aa", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa /dev/mapper/foo-test1 - ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-86e9dd83-d809-42e7-9e1a-14301954a2aa ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", 
"RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:00:28 EDT", "StateChangeTimestampMonotonic": "2417889349", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:01:18 +0000 (0:00:00.701) 0:05:02.772 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:01:20 +0000 (0:00:01.265) 0:05:04.038 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, 
u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:01:20 +0000 (0:00:00.044) 0:05:04.083 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2d86e9dd83\x2dd809\x2d42e7\x2d9e1a\x2d14301954a2aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "name": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d86e9dd83\\x2dd809\\x2d42e7\\x2d9e1a\\x2d14301954a2aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d86e9dd83\\\\x2dd809\\\\x2d42e7\\\\x2d9e1a\\\\x2d14301954a2aa.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:536 Wednesday 01 June 2022 17:01:20 +0000 (0:00:00.694) 0:05:04.778 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:542 Wednesday 01 June 2022 17:01:20 +0000 
(0:00:00.038) 0:05:04.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:01:20 +0000 (0:00:00.037) 0:05:04.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102871.5511215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102871.5511215, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102871.5511215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "296220129", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:01:21 +0000 (0:00:00.371) 0:05:05.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:553 Wednesday 01 June 2022 17:01:21 +0000 (0:00:00.044) 0:05:05.270 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:01:21 +0000 (0:00:00.052) 0:05:05.322 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:01:21 +0000 (0:00:00.047) 0:05:05.369 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.535) 0:05:05.904 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.074) 0:05:05.979 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.033) 0:05:06.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.032) 0:05:06.044 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.065) 0:05:06.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:01:22 +0000 (0:00:00.098) 0:05:06.209 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:01:23 +0000 (0:00:00.881) 0:05:07.091 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 
June 2022 17:01:23 +0000 (0:00:00.039) 0:05:07.130 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:01:23 +0000 (0:00:00.034) 0:05:07.164 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:01:24 +0000 (0:00:01.222) 0:05:08.387 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:01:24 +0000 (0:00:00.055) 0:05:08.443 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:01:24 +0000 (0:00:00.030) 0:05:08.473 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:01:24 +0000 (0:00:00.031) 0:05:08.505 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 
June 2022 17:01:24 +0000 (0:00:00.029) 0:05:08.535 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:01:25 +0000 (0:00:00.879) 0:05:09.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": 
"cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": 
"emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": 
"nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { 
"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:01:27 +0000 (0:00:01.723) 0:05:11.138 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:01:27 +0000 (0:00:00.045) 0:05:11.184 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:01:27 +0000 (0:00:00.029) 0:05:11.214 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
"yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:01:39 +0000 (0:00:12.159) 0:05:23.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:01:39 +0000 (0:00:00.033) 0:05:23.407 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:01:39 +0000 (0:00:00.030) 0:05:23.437 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "present" } ], "failed": false, 
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage 
: set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:01:39 +0000 (0:00:00.043) 0:05:23.480 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:01:39 +0000 (0:00:00.045) 0:05:23.526 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:01:39 +0000 (0:00:00.039) 0:05:23.565 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:01:40 +0000 (0:00:00.399) 0:05:23.964 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:01:40 +0000 (0:00:00.668) 0:05:24.633 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b" } TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:01:41 +0000 (0:00:00.405) 0:05:25.039 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:01:41 +0000 (0:00:00.648) 0:05:25.687 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102864.8541214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102862.7121215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102862.7101214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3948864398", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:01:42 +0000 (0:00:00.390) 0:05:26.077 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-662f5f4f-ad4e-4732-a294-8e3657855c6b', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": 
"/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:01:42 +0000 (0:00:00.406) 0:05:26.484 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:569 Wednesday 01 June 2022 17:01:43 +0000 (0:00:00.873) 0:05:27.358 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:01:43 +0000 (0:00:00.052) 0:05:27.410 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:01:43 +0000 (0:00:00.044) 0:05:27.455 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:01:43 +0000 (0:00:00.104) 0:05:27.559 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "662f5f4f-ad4e-4732-a294-8e3657855c6b" }, "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "size": "4G", "type": "crypt", "uuid": "e0c526a9-b144-48a5-a56a-101c33f94315" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": 
"1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.369) 0:05:27.928 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002512", "end": "2022-06-01 13:01:43.785130", "rc": 0, "start": "2022-06-01 13:01:43.782618" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.365) 0:05:28.294 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002528", "end": "2022-06-01 13:01:44.151685", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:01:44.149157" }

STDOUT:

luks-662f5f4f-ad4e-4732-a294-8e3657855c6b /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.370) 0:05:28.664 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.069) 0:05:28.734 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.033) 0:05:28.767 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.065) 0:05:28.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:01:44 +0000 (0:00:00.041) 0:05:28.874 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv":
"/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.388) 0:05:29.262 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.046) 0:05:29.309 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.039) 0:05:29.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.038) 0:05:29.387 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.036) 0:05:29.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.035) 0:05:29.460 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.045) 0:05:29.505 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.058) 0:05:29.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.032) 0:05:29.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.034) 0:05:29.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.032) 0:05:29.664 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.034) 0:05:29.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare 
devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.031) 0:05:29.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.032) 0:05:29.763 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.032) 0:05:29.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:01:45 +0000 (0:00:00.037) 0:05:29.833 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.061) 0:05:29.894 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.062) 0:05:29.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.034) 0:05:29.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.032) 0:05:30.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.032) 0:05:30.056 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.064) 0:05:30.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.039) 0:05:30.159 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.036) 0:05:30.196 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.057) 0:05:30.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.037) 0:05:30.291 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.036) 0:05:30.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.082) 0:05:30.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.032) 0:05:30.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.033) 0:05:30.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.045) 0:05:30.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.042) 0:05:30.563 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.078) 0:05:30.642 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.068) 0:05:30.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.031) 0:05:30.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.039) 0:05:30.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.034) 0:05:30.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.033) 0:05:30.849 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:01:46 +0000 (0:00:00.032) 0:05:30.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.031) 0:05:30.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.034) 0:05:30.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.036) 0:05:30.985 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.033) 0:05:31.018 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.061) 0:05:31.080 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.036) 0:05:31.116 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.123) 0:05:31.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.038) 0:05:31.278 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1026386, "block_size": 4096, "block_total": 1041920, "block_used": 15534, "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4204077056, "size_total": 4267704320, "uuid": "e0c526a9-b144-48a5-a56a-101c33f94315" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1026386, "block_size": 4096, "block_total": 1041920, "block_used": 15534, "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4204077056, "size_total": 4267704320, "uuid": "e0c526a9-b144-48a5-a56a-101c33f94315" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.046) 0:05:31.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by 
mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.039) 0:05:31.364 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.035) 0:05:31.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.038) 0:05:31.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.033) 0:05:31.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.030) 0:05:31.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.032) 0:05:31.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, 
"storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.033) 0:05:31.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.048) 0:05:31.617 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.036) 0:05:31.653 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.039) 0:05:31.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.031) 0:05:31.725 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.038) 0:05:31.764 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.042) 0:05:31.806 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:01:47 +0000 (0:00:00.039) 0:05:31.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102898.6221216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102898.6221216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12444, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102898.6221216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.375) 0:05:32.221 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.039) 0:05:32.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.039) 0:05:32.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.036) 0:05:32.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.033) 0:05:32.370 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.037) 0:05:32.408 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102898.7781215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102898.7781215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12602, "isblk": 
true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102898.7781215, "nlink": 1, "path": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:01:48 +0000 (0:00:00.371) 0:05:32.780 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.012847", "end": "2022-06-01 13:01:48.670016", "rc": 0, "start": "2022-06-01 13:01:48.657169" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 662f5f4f-ad4e-4732-a294-8e3657855c6b Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 11 Memory: 906462 Threads: 4 Salt: 8c de b7 70 0a d2 31 8d aa 31 e3 aa 27 4d 8e ff d3 ec 81 89 63 e2 65 df 83 98 de ed bd 64 fd 99 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 97669 Salt: ce 6d 9d ba 72 8b 74 f9 27 f0 c8 0b a4 10 e0 b7 39 ba dc 59 e0 a7 2c c7 44 b9 c5 90 1f 67 a6 d6 Digest: 9e 6b bf e0 82 1c 01 73 b6 ce 95 0b b2 3b b6 6f 97 44 e4 dd 51 20 ac 87 d0 bb 73 b7 ae 71 ee 09 TASK [Verify the presence/absence of the LUKS device node] ********************* task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.400) 0:05:33.181 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.041) 0:05:33.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.038) 0:05:33.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.037) 0:05:33.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.038) 0:05:33.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.033) 0:05:33.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.032) 0:05:33.402 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.031) 0:05:33.434 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.040) 0:05:33.474 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.035) 0:05:33.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.038) 0:05:33.548 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.040) 0:05:33.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 
17:01:49 +0000 (0:00:00.038) 0:05:33.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.031) 0:05:33.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.031) 0:05:33.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.032) 0:05:33.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.033) 0:05:33.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.034) 0:05:33.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.033) 0:05:33.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:01:49 +0000 (0:00:00.032) 0:05:33.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.034) 0:05:33.892 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.392) 0:05:34.284 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.380) 0:05:34.664 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.040) 0:05:34.704 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 
Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.035) 0:05:34.740 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.034) 0:05:34.775 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.032) 0:05:34.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.032) 0:05:34.840 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:01:50 +0000 (0:00:00.031) 0:05:34.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.031) 0:05:34.903 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.037) 0:05:34.940 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] 
******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.037) 0:05:34.977 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.038) 0:05:35.016 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043112", "end": "2022-06-01 13:01:50.930690", "rc": 0, "start": "2022-06-01 13:01:50.887578" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.423) 0:05:35.440 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.042) 0:05:35.482 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.043) 0:05:35.526 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.035) 0:05:35.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.034) 0:05:35.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.036) 0:05:35.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.035) 0:05:35.668 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.033) 0:05:35.702 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.032) 0:05:35.734 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.030) 0:05:35.765 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:571
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.032) 0:05:35.797 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:01:51 +0000 (0:00:00.063) 0:05:35.861 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.047) 0:05:35.908 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.528) 0:05:36.436 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping:
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.083) 0:05:36.520 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.046) 0:05:36.566 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.036) 0:05:36.602 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.070) 0:05:36.673 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:01:52 +0000 (0:00:00.029) 0:05:36.703 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:01:53 +0000 (0:00:00.903) 0:05:37.606 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:01:53 +0000 (0:00:00.037) 0:05:37.643 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:01:53 +0000 (0:00:00.039) 0:05:37.682 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:01:55 +0000 (0:00:01.282) 0:05:38.965 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:01:55 +0000 (0:00:00.059) 0:05:39.024 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:01:55 +0000 (0:00:00.030) 0:05:39.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:01:55 +0000 (0:00:00.034) 0:05:39.090 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:01:55 +0000 (0:00:00.031) 0:05:39.121 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:01:56 +0000 (0:00:00.833) 0:05:39.955 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { 
"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { 
"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:01:57 +0000 (0:00:01.696) 0:05:41.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:01:57 +0000 
(0:00:00.050) 0:05:41.701 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:01:57 +0000 (0:00:00.031) 0:05:41.733 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": 
"defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:01:59 +0000 (0:00:02.138) 0:05:43.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.033) 0:05:43.905 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.029) 0:05:43.934 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", 
"/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.041) 0:05:43.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.039) 0:05:44.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.038) 0:05:44.054 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-662f5f4f-ad4e-4732-a294-8e3657855c6b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:02:00 +0000 (0:00:00.404) 0:05:44.458 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 
Wednesday 01 June 2022 17:02:01 +0000 (0:00:00.670) 0:05:45.129 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:02:01 +0000 (0:00:00.034) 0:05:45.163 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:02:01 +0000 (0:00:00.643) 0:05:45.807 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102904.1511216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f91c0a88faf0ba740913deeef8a2e00ab38ef1cb", "ctime": 1654102901.9581215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102901.9571216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2747625683", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:02:02 +0000 (0:00:00.385) 0:05:46.192 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-662f5f4f-ad4e-4732-a294-8e3657855c6b', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": 
"entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:02:02 +0000 (0:00:00.406) 0:05:46.598 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:581 Wednesday 01 June 2022 17:02:03 +0000 (0:00:00.844) 0:05:47.443 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:02:03 +0000 (0:00:00.053) 0:05:47.497 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:02:03 +0000 (0:00:00.032) 0:05:47.530 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=KO16yV-b5Uw-n7Jk-9qTn-zvQP-1MuD-DJK1Po", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": 
null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:02:03 +0000 (0:00:00.039) 0:05:47.569 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.394) 0:05:47.964 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002482", "end": "2022-06-01 13:02:03.822102", "rc": 0, "start": "2022-06-01 13:02:03.819620" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.364) 0:05:48.329 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003180", "end": "2022-06-01 13:02:04.215960", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:02:04.212780" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.396) 0:05:48.725 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.029) 0:05:48.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.031) 0:05:48.786 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:02:04 +0000 (0:00:00.063) 0:05:48.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.039) 0:05:48.889 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.114) 0:05:49.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.035) 0:05:49.039 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.041) 0:05:49.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.029) 0:05:49.111 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.037) 0:05:49.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.036) 0:05:49.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.034) 0:05:49.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.033) 
0:05:49.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.031) 0:05:49.283 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.032) 0:05:49.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.047) 0:05:49.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.026) 0:05:49.390 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.037) 0:05:49.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.028) 0:05:49.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.029) 0:05:49.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.030) 0:05:49.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:02:05 +0000 (0:00:00.027) 0:05:49.543 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102919.2471216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102919.2471216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, 
"isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654102919.2471216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.389) 0:05:49.933 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.036) 0:05:49.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.025) 0:05:49.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.053) 0:05:50.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.033) 0:05:50.081 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.026) 0:05:50.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.030) 0:05:50.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.030) 0:05:50.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.389 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.042) 0:05:50.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.041) 0:05:50.474 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.030) 0:05:50.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.039) 0:05:50.544 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.032) 0:05:50.576 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.607 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.029) 0:05:50.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.033) 0:05:50.734 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.032) 0:05:50.766 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.032) 0:05:50.799 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.034) 0:05:50.833 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:02:06 +0000 (0:00:00.031) 0:05:50.865 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.032) 0:05:50.897 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.031) 0:05:50.928 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.034) 0:05:50.963 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.031) 0:05:50.994 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.092) 0:05:51.087 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.032) 0:05:51.119 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.032) 0:05:51.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.030) 0:05:51.182 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.035) 0:05:51.218 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.033) 0:05:51.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.034) 0:05:51.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.031) 0:05:51.317 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.030) 0:05:51.347 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.030) 0:05:51.377 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.030) 0:05:51.408 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.030) 0:05:51.438 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.032) 0:05:51.471 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.031) 0:05:51.502 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.031) 0:05:51.533 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1161 changed=61 unreachable=0 failed=9 skipped=648 rescued=9 ignored=0

Wednesday 01 June 2022 17:02:07 +0000 (0:00:00.016) 0:05:51.550 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 16.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 14.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 12.16s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 11.12s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 9.60s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.99s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.14s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:02:08 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:02:09 +0000 (0:00:01.285) 0:00:01.308 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.29s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml

PLAYBOOK: tests_luks_nvme_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_luks_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:02:09 +0000 (0:00:00.079) 0:00:01.387 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.29s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:02:10 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:02:11 +0000 (0:00:01.240) 0:00:01.264 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml

PLAYBOOK: tests_luks_pool.yml **************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_luks_pool.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:2
Wednesday 01 June 2022 17:02:11 +0000 (0:00:00.040) 0:00:01.304 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:14
Wednesday 01 June 2022 17:02:12 +0000 (0:00:01.077) 0:00:02.382 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:02:12 +0000 (0:00:00.036) 0:00:02.419 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.159) 0:00:02.578 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.517) 0:00:03.095 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.077) 0:00:03.173 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.023) 0:00:03.196 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.022) 0:00:03.218 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.196) 0:00:03.415 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:02:13 +0000 (0:00:00.019) 0:00:03.434 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:02:15 +0000 (0:00:01.045) 0:00:04.480 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.047) 0:00:04.527 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.045) 0:00:04.572 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.670) 0:00:05.243 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.082) 0:00:05.326 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.019) 0:00:05.345 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.021) 0:00:05.367 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:02:15 +0000 (0:00:00.019) 0:00:05.387 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:02:16 +0000 (0:00:00.880) 0:00:06.267 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": {
    "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
    "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
    "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
    "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
    "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
    "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
    "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
    "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
    "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
    "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
    "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
    "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
    "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
    "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
    "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
    "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
    "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
    "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
    "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
    "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
    "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
    "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
    "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
    "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
    "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
    "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
    "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
    "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
    "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
    "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
    "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
    "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
    "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
    "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
    "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
    "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
    "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
    "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
    "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
    "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
    "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
    "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
    "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
    "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
    "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
    "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
    "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
    "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
    "microcode.service": {
"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service": { "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": 
"systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:02:18 +0000 (0:00:01.847) 0:00:08.115 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:02:18 +0000 (0:00:00.044) 0:00:08.159 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.target\" umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-662f5f4f-ad4e-4732-a294-8e3657855c6b", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-662f5f4f-ad4e-4732-a294-8e3657855c6b /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-662f5f4f-ad4e-4732-a294-8e3657855c6b /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-662f5f4f-ad4e-4732-a294-8e3657855c6b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-662f5f4f-ad4e-4732-a294-8e3657855c6b ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", 
"MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 
2022-06-01 13:02:01 EDT", "StateChangeTimestampMonotonic": "2510314205", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:02:19 +0000 (0:00:00.982) 0:00:09.141 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:02:20 +0000 (0:00:00.528) 0:00:09.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 
Wednesday 01 June 2022 17:02:20 +0000 (0:00:00.029) 0:00:09.700 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", 
"LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", 
"RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:02:20 +0000 (0:00:00.683) 0:00:10.383 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } 
} TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:02:20 +0000 (0:00:00.031) 0:00:10.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:02:20 +0000 (0:00:00.034) 0:00:10.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.031) 0:00:10.480 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.027) 0:00:10.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.027) 0:00:10.536 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.025) 0:00:10.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.030) 0:00:10.592 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102924.2151215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102922.0751214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102922.0741215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "753644764", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.511) 0:00:11.103 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:02:21 +0000 (0:00:00.029) 0:00:11.132 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:17 Wednesday 01 June 2022 17:02:22 +0000 (0:00:00.826) 0:00:11.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** 
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:23 Wednesday 01 June 2022 17:02:22 +0000 (0:00:00.029) 0:00:11.989 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:02:22 +0000 (0:00:00.049) 0:00:12.038 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.527) 0:00:12.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.033) 0:00:12.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.029) 0:00:12.629 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create an encrypted lvm pool] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:34 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.031) 0:00:12.661 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.048) 0:00:12.710 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used 
by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.043) 0:00:12.753 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.535) 0:00:13.288 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.068) 0:00:13.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.029) 
0:00:13.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.029) 0:00:13.415 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:02:23 +0000 (0:00:00.060) 0:00:13.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:02:24 +0000 (0:00:00.024) 0:00:13.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:02:24 +0000 (0:00:00.028) 0:00:13.529 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:02:24 +0000 (0:00:00.034) 0:00:13.563 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:02:24 +0000 (0:00:00.031) 0:00:13.595 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:02:25 +0000 (0:00:01.033) 0:00:14.628 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:02:25 +0000 (0:00:00.055) 0:00:14.684 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:02:25 +0000 (0:00:00.027) 0:00:14.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:02:25 +0000 (0:00:00.032) 0:00:14.744 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:02:25 +0000 (0:00:00.026) 0:00:14.770 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:02:26 +0000 (0:00:00.900) 0:00:15.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:02:26 +0000 (0:00:00.030) 0:00:15.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:02:26 +0000 (0:00:00.047) 0:00:15.748 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": 
"yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:02:26 +0000 (0:00:00.725) 0:00:16.474 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted pool 'foo' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:02:28 +0000 (0:00:01.023) 0:00:17.498 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': True, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, 
u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted pool 'foo' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:02:28 +0000 (0:00:00.041) 0:00:17.539 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:53 Wednesday 01 June 2022 17:02:28 +0000 (0:00:00.691) 0:00:18.230 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:59 Wednesday 01 June 2022 17:02:28 +0000 (0:00:00.034) 0:00:18.265 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:66 Wednesday 01 June 2022 17:02:28 +0000 (0:00:00.033) 0:00:18.299 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Create a key file] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:75 Wednesday 01 June 2022 17:02:28 +0000 (0:00:00.032) 0:00:18.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testp8ufvdbvlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:82 
Wednesday 01 June 2022 17:02:29 +0000 (0:00:00.535) 0:00:18.867 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testp8ufvdbvlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1654102949.44-79411-59115887350867/source", "state": "file", "uid": 0 } TASK [Create an encrypted lvm pool using a key file] *************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:89 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.782) 0:00:19.649 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.047) 0:00:19.697 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.043) 0:00:19.741 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.512) 0:00:20.253 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { 
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.071) 0:00:20.324 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.030) 0:00:20.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.028) 0:00:20.384 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.062) 0:00:20.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:02:30 +0000 (0:00:00.024) 0:00:20.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.027) 0:00:20.498 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_key": "/tmp/storage_testp8ufvdbvlukskey", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.036) 0:00:20.534 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.031) 0:00:20.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.029) 0:00:20.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.031) 0:00:20.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.029) 0:00:20.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.028) 0:00:20.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.042) 0:00:20.728 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": 
"infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", 
"IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", 
"MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:02:31 +0000 (0:00:00.695) 0:00:21.423 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "/tmp/storage_testp8ufvdbvlukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": 
[ "lvm2", "cryptsetup", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp8ufvdbvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:02:42 +0000 (0:00:10.931) 0:00:32.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:02:42 +0000 
(0:00:00.032) 0:00:32.387 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:02:43 +0000 (0:00:00.699) 0:00:33.086 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": null },
        { "action": "create format", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": "lvmpv" },
        { "action": "create device", "device": "/dev/foo", "fs_type": null },
        { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null },
        { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }
    ],
    "changed": true,
    "crypts": [ { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "/tmp/storage_testp8ufvdbvlukskey", "state": "present" } ],
    "failed": false,
    "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ],
    "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ],
    "packages": [ "lvm2", "cryptsetup", "dosfstools", "xfsprogs" ],
    "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp8ufvdbvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm",
        "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ],
    "volumes": []
} }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022  17:02:43 +0000 (0:00:00.040)       0:00:33.126 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp8ufvdbvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm",
    "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022  17:02:43 +0000 (0:00:00.037)       0:00:33.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022  17:02:43 +0000 (0:00:00.061)       0:00:33.226 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022  17:02:43 +0000 (0:00:00.028)       0:00:33.254 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022  17:02:44 +0000 (0:00:00.651)       0:00:33.905 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022  17:02:44 +0000 (0:00:00.493)       0:00:34.399 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022  17:02:45 +0000 (0:00:00.657)       0:00:35.056 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102924.2151215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102922.0751214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102922.0741215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "753644764", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022  17:02:45 +0000 (0:00:00.376)       0:00:35.433 ********
changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'/tmp/storage_testp8ufvdbvlukskey', u'name': u'luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "/tmp/storage_testp8ufvdbvlukskey", "state": "present" } }
MSG: line added

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022  17:02:46 +0000 (0:00:00.879)       0:00:35.964 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:104
Wednesday 01 June 2022  17:02:47 +0000 (0:00:00.879)       0:00:36.844 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022  17:02:47 +0000 (0:00:00.048)       0:00:36.893 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testp8ufvdbvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm",
    "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022  17:02:47 +0000 (0:00:00.037)       0:00:36.930 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022  17:02:47 +0000 (0:00:00.028)       0:00:36.959 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": {
    "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "e8497e84-0087-47aa-af3d-f29a1f7c238c" },
    "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "size": "10G", "type": "crypt", "uuid": "1iZACH-VdqP-P9fx-m51M-EZbG-fQ40-mStrcT" },
    "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "338c0e7f-0607-41b7-8317-7c0ca8f4b7db" },
    "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
    "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
    "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
    "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
    "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
    "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
} }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022  17:02:48 +0000 (0:00:00.535)       0:00:37.494 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003202", "end": "2022-06-01 13:02:47.907368", "rc": 0, "start": "2022-06-01 13:02:47.904166" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  17:02:48 +0000 (0:00:00.515)       0:00:38.010 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002502", "end": "2022-06-01 13:02:48.284042", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:02:48.281540" }
STDOUT:
luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db /dev/sda /tmp/storage_testp8ufvdbvlukskey

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  17:02:48 +0000
(0:00:00.371)       0:00:38.381 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022  17:02:48 +0000 (0:00:00.064)       0:00:38.445 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.033)       0:00:38.479 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.063)       0:00:38.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.044)       0:00:38.587 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "pv": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.493)       0:00:39.081 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.042)       0:00:39.123 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.038)       0:00:39.161 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.035)       0:00:39.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.030)       0:00:39.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.029)       0:00:39.257 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.041)       0:00:39.298 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.059)       0:00:39.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.032)       0:00:39.390 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.030)       0:00:39.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022  17:02:49 +0000 (0:00:00.031)       0:00:39.451 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.030)       0:00:39.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.030)       0:00:39.512 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.028)       0:00:39.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.027)       0:00:39.567 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.027)       0:00:39.595 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.057)       0:00:39.652 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.056)       0:00:39.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.028)       0:00:39.738 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.027)       0:00:39.766 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.028)       0:00:39.794 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.058)       0:00:39.853 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testp8ufvdbvlukskey" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.032)       0:00:39.886 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2

TASK [Get the backing device path] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.056)       0:00:39.943 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/disk/by-uuid/338c0e7f-0607-41b7-8317-7c0ca8f4b7db" ], "delta": "0:00:00.002518", "end": "2022-06-01 13:02:50.219166", "rc": 0, "start": "2022-06-01 13:02:50.216648" }
STDOUT:
/dev/sda

TASK [Collect LUKS info for this member] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6
Wednesday 01 June 2022  17:02:50 +0000 (0:00:00.380)       0:00:40.324 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011993", "end": "2022-06-01 13:02:50.608550", "rc": 0, "start": "2022-06-01 13:02:50.596557" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           338c0e7f-0607-41b7-8317-7c0ca8f4b7db
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)
Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]
Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  9
        Memory:     906462
        Threads:    4
        Salt:       17 40 0c bc 5e 0e 49 42 6a 46 38 b6 b5 47 81 1b 92 4d 39 ab 0d f5 d2 8a 4a 84 e8 3c 0a fd 1d a0
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 97523
        Salt:       93 5c e8 e2 16 1b 12 f9 d4 0e d4 15 19 62 03 5b 6d f9 24 de d3 e7 ce 89 fe d9 b4 67 92 5f b7 a4
        Digest:     e6 25 61 d7 bf 51 4a 98 e8 8b 23 2f 1a ef 2a 52 64 68 45 c1 65 1e 3f 1e 36 2d e4 c8 cd cd 5d 99

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.390)       0:00:40.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:40.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.028)       0:00:40.774 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.027)       0:00:40.801 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.057)       0:00:40.859 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db /dev/sda /tmp/storage_testp8ufvdbvlukskey" ] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.036)       0:00:40.896 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.036)       0:00:40.933 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path:
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.041)       0:00:40.975 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:41.006 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.036)       0:00:41.042 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.029)       0:00:41.071 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:41.103 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.064)       0:00:41.167 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.104)       0:00:41.272 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.032)       0:00:41.304 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:41.336 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:41.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.030)       0:00:41.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.031)       0:00:41.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022  17:02:51 +0000 (0:00:00.033)       0:00:41.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.031)       0:00:41.495 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.029)       0:00:41.524 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.030)       0:00:41.555 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.057)       0:00:41.612 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.035)       0:00:41.648 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.123)       0:00:41.772 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.034)       0:00:41.807 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
    "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "e8497e84-0087-47aa-af3d-f29a1f7c238c" } ],
    "storage_test_mount_expected_match_count": "1",
    "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "e8497e84-0087-47aa-af3d-f29a1f7c238c" } ],
    "storage_test_swap_expected_matches": "0"
}, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.040)       0:00:41.847 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.038)       0:00:41.886 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.034)       0:00:41.920 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.037)       0:00:41.957 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.030)       0:00:41.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.032)       0:00:42.021 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.031)       0:00:42.053 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.034)       0:00:42.087 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.050)       0:00:42.138 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.036)       0:00:42.175 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.037)       0:00:42.213 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.031)       0:00:42.244 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.031)       0:00:42.275 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.039)       0:00:42.314 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  17:02:52 +0000 (0:00:00.042)       0:00:42.356 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102962.1881216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102962.1881216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12856, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid":
false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102962.1881216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.384) 0:00:42.741 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.035) 0:00:42.777 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.036) 0:00:42.814 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.033) 0:00:42.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:42.876 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.035) 0:00:42.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.028) 0:00:42.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:42.970 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.031) 0:00:43.001 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.036) 0:00:43.038 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.030) 0:00:43.068 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.032) 0:00:43.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.030) 0:00:43.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.030) 0:00:43.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:43.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.036) 0:00:43.228 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.034) 0:00:43.263 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.032) 0:00:43.295 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:43.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:43.354 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.030) 0:00:43.385 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.029) 0:00:43.414 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:02:53 +0000 (0:00:00.030) 0:00:43.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was
False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.073) 0:00:43.519 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.031) 0:00:43.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.030) 0:00:43.582 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.031) 0:00:43.613 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.030) 0:00:43.644 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:02:54 +0000 (0:00:00.467) 0:00:44.112 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.372) 0:00:44.485 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.038) 0:00:44.523 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.035) 0:00:44.558 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.031) 0:00:44.590 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.031) 0:00:44.621 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.031) 0:00:44.653 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.030) 0:00:44.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.029) 0:00:44.713 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.045) 0:00:44.758 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.034) 0:00:44.793 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.039) 0:00:44.833 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036254", "end": "2022-06-01 13:02:55.134541", "rc": 0, "start": "2022-06-01 13:02:55.098287" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.413) 0:00:45.246 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.039) 0:00:45.285 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.040) 0:00:45.326 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.033) 0:00:45.359 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.034) 0:00:45.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.031) 0:00:45.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:02:55 +0000 (0:00:00.032) 0:00:45.458 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:02:56 +0000 (0:00:00.030) 0:00:45.488 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:02:56 +0000 (0:00:00.029) 0:00:45.518 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:02:56 +0000 (0:00:00.028) 0:00:45.546 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove the key file] *****************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:106
Wednesday 01 June 2022 17:02:56 +0000 (0:00:00.033) 0:00:45.579 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "path": "/tmp/storage_testp8ufvdbvlukskey", "state": "absent" }

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 17:02:56 +0000 (0:00:00.538) 0:00:46.118 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Remove the encryption layer] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:116
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.385) 0:00:46.504 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.050) 0:00:46.555 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.053) 0:00:46.608 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.517) 0:00:47.125 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.075) 0:00:47.201 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.031) 0:00:47.232 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.030) 0:00:47.262 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.065) 0:00:47.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.029) 0:00:47.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:02:57 +0000 (0:00:00.034) 0:00:47.392 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.101) 0:00:47.494 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.035) 0:00:47.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.031) 0:00:47.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.030) 0:00:47.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.031) 0:00:47.623 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.030) 0:00:47.653 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.046) 0:00:47.700 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } }

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:02:58 +0000 (0:00:00.721) 0:00:48.421 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove and recreate existing pool 'foo' in safe mode

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:03:00 +0000 (0:00:01.271) 0:00:49.693 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': 0, u'encryption_key': None, u'encryption_luks_version': u'luks2', u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count':
None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove and recreate existing pool 'foo' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:03:00 +0000 (0:00:00.042) 0:00:49.735 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": 
"auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:134 Wednesday 01 June 2022 17:03:00 +0000 (0:00:00.684) 0:00:50.420 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:140 Wednesday 01 June 2022 17:03:00 +0000 (0:00:00.036) 0:00:50.457 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:03:01 +0000 (0:00:00.035) 0:00:50.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102976.4041214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102976.4041214, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, 
"isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102976.4041214, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2399984928", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:03:01 +0000 (0:00:00.373) 0:00:50.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:151 Wednesday 01 June 2022 17:03:01 +0000 (0:00:00.034) 0:00:50.900 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:03:01 +0000 (0:00:00.051) 0:00:50.951 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:03:01 +0000 (0:00:00.045) 0:00:50.997 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.511) 0:00:51.508 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.069) 0:00:51.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.032) 0:00:51.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.030) 0:00:51.642 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.062) 0:00:51.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.026) 0:00:51.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.063) 0:00:51.794 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.036) 0:00:51.831 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.033) 0:00:51.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 
17:03:02 +0000 (0:00:00.030) 0:00:51.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.030) 0:00:51.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.030) 0:00:51.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.036) 0:00:51.993 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:03:02 +0000 (0:00:00.053) 0:00:52.046 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": 
"0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": 
"[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:03:03 +0000 (0:00:00.722) 0:00:52.768 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", 
"device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": 
null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:03:05 +0000 (0:00:02.434) 0:00:55.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:03:05 +0000 (0:00:00.030) 0:00:55.234 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct 
cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": 
"no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", 
"UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:03:06 +0000 (0:00:00.701) 0:00:55.935 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:03:06 +0000 (0:00:00.042) 0:00:55.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": 
"lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:03:06 +0000 (0:00:00.039) 0:00:56.017 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:03:06 +0000 (0:00:00.036) 0:00:56.053 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", 
"src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:03:06 +0000 (0:00:00.417) 0:00:56.471 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:03:07 +0000 (0:00:00.681) 0:00:57.152 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:03:08 +0000 (0:00:00.416) 0:00:57.568 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:03:08 +0000 (0:00:00.656) 0:00:58.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102968.2831216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e9059fe8fdc5152f07acc0608bec32a65e132850", "ctime": 1654102965.8541214, 
"dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792388, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654102965.8531215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 84, "uid": 0, "version": "688806283", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:03:09 +0000 (0:00:00.387) 0:00:58.612 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-338c0e7f-0607-41b7-8317-7c0ca8f4b7db", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:03:09 +0000 (0:00:00.389) 0:00:59.002 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:167 Wednesday 01 June 2022 17:03:10 +0000 (0:00:00.959) 0:00:59.961 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 
2022 17:03:10 +0000 (0:00:00.049) 0:01:00.011 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:03:10 +0000 (0:00:00.040) 0:01:00.051 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:03:10 +0000 (0:00:00.031) 0:01:00.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "ebb51ec3-e7ac-4faf-b90b-3d72cc51105b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "6Zb0h9-6Gh2-zhpF-vOAo-d6zo-secW-mZTs7n" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:03:10 +0000 (0:00:00.382) 0:01:00.465 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003306", "end": "2022-06-01 13:03:10.745315", "rc": 0, "start": "2022-06-01 13:03:10.742009" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.380) 0:01:00.846 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002714", "end": "2022-06-01 13:03:11.115759", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:03:11.113045" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.371) 0:01:01.217 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
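The warning above is Ansible's standard advice for nested loops: give the inner loop its own variable via `loop_control`. A minimal sketch of that pattern follows — the task name and replacement variable name are hypothetical, and the actual task in verify-role-results.yml may differ:

```yaml
# Hypothetical sketch of the loop_control fix the warning recommends;
# not the actual contents of verify-role-results.yml.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item  # avoids colliding with 'storage_test_pool'
```

With a distinct `loop_var`, the included tasks reference `storage_test_pool_item` instead of the already-in-use `storage_test_pool`, which silences the collision warning.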
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.067) 0:01:01.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.032) 0:01:01.318 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.062) 0:01:01.381 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:03:11 +0000 (0:00:00.045) 0:01:01.427 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.391) 0:01:01.818 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.042) 0:01:01.861 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.037) 0:01:01.898 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.035) 0:01:01.933 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.037) 0:01:01.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.030) 0:01:02.001 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.050) 0:01:02.052 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.057) 0:01:02.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.032) 0:01:02.141 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.030) 0:01:02.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.031) 0:01:02.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.031) 0:01:02.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.029) 0:01:02.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:03:12 +0000 (0:00:00.030) 0:01:02.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.031) 0:01:02.328 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.038) 0:01:02.366 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:03:12 +0000 (0:00:00.064) 0:01:02.431 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.061) 0:01:02.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.032) 0:01:02.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:03:13 +0000 (0:00:00.033) 0:01:02.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:02.589 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.062) 0:01:02.651 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.035) 0:01:02.687 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.036) 0:01:02.724 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.108) 0:01:02.833 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.040) 0:01:02.873 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.038) 0:01:02.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:02.942 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.032) 0:01:02.975 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:03.005 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:03.035 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.031) 0:01:03.066 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.067) 0:01:03.133 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.064) 0:01:03.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.031) 0:01:03.229 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:03.259 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.032) 0:01:03.292 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.031) 0:01:03.324 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.031) 0:01:03.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.030) 0:01:03.385 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.032) 0:01:03.418 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:03:13 +0000 (0:00:00.032) 0:01:03.450 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.035) 0:01:03.486 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.060) 0:01:03.546 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.035) 0:01:03.582 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.123) 0:01:03.705 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.038) 0:01:03.744 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "ebb51ec3-e7ac-4faf-b90b-3d72cc51105b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "ebb51ec3-e7ac-4faf-b90b-3d72cc51105b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.044) 0:01:03.788 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.039) 0:01:03.828 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.037) 0:01:03.865 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.038) 0:01:03.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.030) 0:01:03.933 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.032) 0:01:03.966 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.030) 0:01:03.996 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.031) 0:01:04.027 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.046) 0:01:04.074 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.034) 0:01:04.109 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.039) 0:01:04.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.033) 0:01:04.181 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.031) 0:01:04.213 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.037) 0:01:04.250 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:03:14 +0000 (0:00:00.036) 0:01:04.286 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102985.0301216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654102985.0301216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 12994, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654102985.0301216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.368) 0:01:04.655 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.039) 0:01:04.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.041) 0:01:04.736 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.036) 0:01:04.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.032) 0:01:04.804 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.034) 0:01:04.839 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.028) 0:01:04.867 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.029) 0:01:04.897 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.030) 0:01:04.927 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.036) 0:01:04.964 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.075) 0:01:05.040 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.031) 0:01:05.071 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.030) 0:01:05.102 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.030) 0:01:05.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.032) 0:01:05.165 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.041) 0:01:05.207 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.041) 0:01:05.248 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.031) 0:01:05.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.029) 0:01:05.310 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.029) 0:01:05.339 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.030) 0:01:05.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.032) 0:01:05.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.033) 0:01:05.437 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:03:15 +0000 (0:00:00.030) 0:01:05.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.031) 0:01:05.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.029) 0:01:05.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.029) 0:01:05.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.030) 0:01:05.589 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.383) 0:01:05.972 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.383) 0:01:06.356 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.038) 0:01:06.394 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.035) 0:01:06.430 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:03:16 +0000 (0:00:00.031) 0:01:06.462 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.036) 0:01:06.498 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.031) 0:01:06.530 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.030) 0:01:06.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.030) 0:01:06.591 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.036) 0:01:06.628 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.033) 0:01:06.661 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.038) 0:01:06.700 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038501", "end": "2022-06-01 13:03:17.012867", "rc": 0, "start": "2022-06-01 13:03:16.974366" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.416) 0:01:07.116 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.038) 0:01:07.155 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.041) 0:01:07.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.032) 0:01:07.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.033) 0:01:07.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.032) 0:01:07.294 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.032) 0:01:07.327 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.039) 0:01:07.366 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.034) 0:01:07.400 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.029) 0:01:07.429 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 17:03:17 +0000 (0:00:00.034) 0:01:07.463 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Add encryption to the pool] **********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:173
Wednesday 01 June 2022 17:03:18 +0000 (0:00:00.389) 0:01:07.853 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:03:18 +0000 (0:00:00.048) 0:01:07.902 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:03:18 +0000 (0:00:00.041) 0:01:07.943 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:03:18 +0000 (0:00:00.511) 0:01:08.455 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.073) 0:01:08.528 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.032) 0:01:08.560 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.030) 0:01:08.591 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.061) 0:01:08.652 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.024) 0:01:08.677 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.029) 0:01:08.706 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.036) 0:01:08.743 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.033) 0:01:08.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.030) 0:01:08.806 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June
2022 17:03:19 +0000 (0:00:00.032) 0:01:08.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.029) 0:01:08.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.029) 0:01:08.898 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:03:19 +0000 (0:00:00.043) 0:01:08.942 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", 
"MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:03:20 +0000 (0:00:00.693) 0:01:09.635 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove and recreate existing pool 'foo' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:03:21 +0000 (0:00:01.210) 0:01:10.846 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': 512, u'encryption_key': None, u'encryption_luks_version': u'luks1', u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': u'serpent-xts-plain64'}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, 
u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove and recreate existing pool 'foo' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:03:21 +0000 (0:00:00.041) 0:01:10.887 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid 
cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:194 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.694) 0:01:11.582 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:200 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.038) 0:01:11.621 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.035) 0:01:11.656 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102997.7531216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102997.7531216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654102997.7531216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": 
"49185110", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.382) 0:01:12.039 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the pool] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:211 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.040) 0:01:12.079 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.057) 0:01:12.137 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:03:22 +0000 (0:00:00.046) 0:01:12.184 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.544) 0:01:12.729 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", 
"libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.121) 0:01:12.850 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.032) 0:01:12.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.031) 0:01:12.915 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.064) 0:01:12.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.025) 0:01:13.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.028) 0:01:13.034 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.036) 0:01:13.070 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.035) 0:01:13.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.031) 0:01:13.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.035) 0:01:13.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.031) 0:01:13.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.032) 0:01:13.236 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:03:23 +0000 (0:00:00.044) 0:01:13.280 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": 
"infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", 
"IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", 
"MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:03:24 +0000 (0:00:00.678) 0:01:13.958 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", 
"/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 
June 2022 17:03:32 +0000 (0:00:08.266) 0:01:22.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:03:32 +0000 (0:00:00.034) 0:01:22.259 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm 
cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:03:33 +0000 (0:00:00.706) 0:01:22.966 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:03:33 +0000 (0:00:00.043) 0:01:23.010 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", 
"_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:03:33 +0000 (0:00:00.037) 0:01:23.048 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:03:33 +0000 (0:00:00.033) 0:01:23.081 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:03:33 +0000 (0:00:00.372) 0:01:23.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:03:34 +0000 (0:00:00.675) 0:01:24.130 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:03:35 +0000 (0:00:00.518) 0:01:24.648 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:03:36 +0000 (0:00:00.853) 0:01:25.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654102991.1151216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654102988.8921216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21709, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654102988.8911216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3757759003", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:03:36 +0000 (0:00:00.386) 0:01:25.888 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:03:36 +0000 (0:00:00.425) 0:01:26.313 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:230 Wednesday 01 June 2022 17:03:37 +0000 (0:00:00.842) 0:01:27.155 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:03:37 +0000 (0:00:00.052) 0:01:27.208 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": 
true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:03:37 +0000 (0:00:00.042) 0:01:27.250 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:03:37 +0000 (0:00:00.030) 0:01:27.280 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "813951ce-3555-43f4-bb09-66f76faef81b" }, "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "size": "10G", "type": "crypt", "uuid": "6Ipvsa-G02u-BekP-2lrV-SJGd-UIPT-u88iEo" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:03:38 +0000 (0:00:00.376) 0:01:27.657 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002702", "end": "2022-06-01 13:03:37.913490", "rc": 0, "start": "2022-06-01 13:03:37.910788" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:03:38 +0000 (0:00:00.356) 0:01:28.014 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002549", "end": "2022-06-01 13:03:38.284658", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:03:38.282109" } STDOUT: luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:03:38 +0000 (0:00:00.372) 0:01:28.386 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:03:38 +0000 (0:00:00.066) 0:01:28.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.032) 0:01:28.486 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.064) 0:01:28.550 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.075) 0:01:28.625 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "pv": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.392) 0:01:29.018 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.042) 0:01:29.061 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.038) 0:01:29.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.035) 0:01:29.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.030) 0:01:29.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.032) 0:01:29.198 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.042) 0:01:29.240 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.056) 0:01:29.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.033) 0:01:29.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.034) 0:01:29.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.030) 0:01:29.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.031) 0:01:29.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:03:39 +0000 (0:00:00.031) 0:01:29.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.038) 0:01:29.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.034) 0:01:29.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.035) 0:01:29.567 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.055) 0:01:29.623 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.060) 0:01:29.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.034) 0:01:29.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.032) 0:01:29.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.031) 0:01:29.781 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.062) 0:01:29.844 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.038) 0:01:29.882 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2 TASK [Get the backing device path] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.058) 0:01:29.940 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/disk/by-uuid/bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ], "delta": 
"0:00:00.002664", "end": "2022-06-01 13:03:40.223127", "rc": 0, "start": "2022-06-01 13:03:40.220463" } STDOUT: /dev/sda TASK [Collect LUKS info for this member] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6 Wednesday 01 June 2022 17:03:40 +0000 (0:00:00.385) 0:01:30.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011822", "end": "2022-06-01 13:03:40.610922", "rc": 0, "start": "2022-06-01 13:03:40.599100" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 59 ec 43 cc c8 30 cb ec e8 a6 ed 1e bb a9 eb 4f fd e2 59 c8 MK salt: 76 09 3e 6e 76 fc 0d 5a dd 92 40 2c 3f db d1 15 bf d0 83 e4 3a 39 dd 2e 13 53 95 a8 7c 57 33 19 MK iterations: 96518 UUID: bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f Key Slot 0: ENABLED Iterations: 1558062 Salt: f1 08 11 51 9b 46 de d3 e3 84 6e 8c 84 9f 76 20 7e 62 3a 38 91 43 56 2e d3 70 7a 18 01 b6 7e 6f Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.388) 0:01:30.714 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.041) 0:01:30.755 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.038) 0:01:30.794 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.038) 0:01:30.832 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.063) 0:01:30.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda -" ] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.035) 0:01:30.931 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.035) 0:01:30.967 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.037) 0:01:31.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 
June 2022 17:03:41 +0000 (0:00:00.031) 0:01:31.036 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.036) 0:01:31.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.032) 0:01:31.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.031) 0:01:31.137 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.065) 0:01:31.203 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.123) 0:01:31.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.032) 0:01:31.360 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.030) 0:01:31.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.033) 0:01:31.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:03:41 +0000 (0:00:00.031) 0:01:31.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.031) 0:01:31.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.032) 0:01:31.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.034) 0:01:31.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.034) 0:01:31.589 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.036) 0:01:31.625 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.062) 0:01:31.688 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.035) 0:01:31.724 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.123) 0:01:31.847 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.037) 0:01:31.885 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "813951ce-3555-43f4-bb09-66f76faef81b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "813951ce-3555-43f4-bb09-66f76faef81b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK 
[Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.042) 0:01:31.928 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.037) 0:01:31.966 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.045) 0:01:32.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.046) 0:01:32.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.032) 0:01:32.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.034) 0:01:32.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.033) 0:01:32.158 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.034) 0:01:32.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.048) 0:01:32.240 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.033) 0:01:32.274 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.037) 0:01:32.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.031) 0:01:32.344 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.039) 0:01:32.383 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.038) 0:01:32.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:03:42 +0000 (0:00:00.035) 0:01:32.457 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103012.0451214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103012.0451214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13152, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103012.0451214, "nlink": 1, "path": "/dev/mapper/foo-test1", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.388) 0:01:32.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.038) 0:01:32.883 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.038) 0:01:32.921 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.033) 0:01:32.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.030) 0:01:32.986 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.035) 0:01:33.021 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.029) 0:01:33.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.029) 0:01:33.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.030) 0:01:33.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.036) 0:01:33.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.031) 0:01:33.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.030) 0:01:33.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.030) 0:01:33.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.030) 0:01:33.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.034) 0:01:33.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.039) 0:01:33.343 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.035) 0:01:33.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.031) 0:01:33.410 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.031) 0:01:33.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:03:43 +0000 (0:00:00.031) 0:01:33.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.034) 0:01:33.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.079) 0:01:33.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.031) 0:01:33.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:03:44 +0000 
(0:00:00.032) 0:01:33.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.031) 0:01:33.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.029) 0:01:33.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.032) 0:01:33.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.034) 0:01:33.781 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:03:44 +0000 (0:00:00.388) 0:01:34.169 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.366) 0:01:34.535 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.038) 0:01:34.574 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.033) 0:01:34.608 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.030) 0:01:34.638 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.031) 0:01:34.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.030) 0:01:34.701 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.044) 0:01:34.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:03:45 
+0000 (0:00:00.036) 0:01:34.781 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.036) 0:01:34.818 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.036) 0:01:34.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.039) 0:01:34.894 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037959", "end": "2022-06-01 13:03:45.215617", "rc": 0, "start": "2022-06-01 13:03:45.177658" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.424) 0:01:35.319 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.056) 
0:01:35.376 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.046) 0:01:35.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:03:45 +0000 (0:00:00.034) 0:01:35.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.032) 0:01:35.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.033) 0:01:35.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.032) 0:01:35.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.030) 0:01:35.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": 
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.033) 0:01:35.619 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.029) 0:01:35.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.030) 0:01:35.678 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Change the mountpoint, leaving encryption in place] ********************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:234 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.398) 0:01:36.077 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.070) 0:01:36.147 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:03:46 +0000 (0:00:00.048) 0:01:36.195 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.531) 0:01:36.727 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.073) 0:01:36.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.032) 0:01:36.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 
01 June 2022 17:03:47 +0000 (0:00:00.076) 0:01:36.910 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.063) 0:01:36.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.027) 0:01:37.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.032) 0:01:37.034 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "test1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.037) 0:01:37.071 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.032) 0:01:37.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.030) 0:01:37.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.032) 0:01:37.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.030) 0:01:37.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.033) 0:01:37.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:03:47 +0000 (0:00:00.045) 0:01:37.277 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit 
systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:03:48 +0000 (0:00:00.705) 0:01:37.982 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:03:49 +0000 (0:00:01.363) 0:01:39.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:03:49 +0000 (0:00:00.032) 0:01:39.378 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:03:50 +0000 (0:00:00.686) 0:01:40.065 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", 
"/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:03:50 +0000 (0:00:00.039) 0:01:40.104 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:03:50 +0000 (0:00:00.041) 0:01:40.145 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:03:50 +0000 (0:00:00.037) 0:01:40.183 ******** changed: 
[/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:03:51 +0000 (0:00:00.404) 0:01:40.587 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:03:51 +0000 (0:00:00.690) 0:01:41.278 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:03:52 +0000 (0:00:00.421) 0:01:41.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:03:52 +0000 (0:00:00.680) 0:01:42.379 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103018.2841215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b52a64e05fd9717766425376aeae2c08238d8609", "ctime": 1654103016.2001214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792390, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103016.1991215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "4082670406", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:03:53 +0000 (0:00:00.398) 0:01:42.778 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:03:53 +0000 (0:00:00.030) 0:01:42.809 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert to implicitly preserve encryption on existing pool] *************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:246 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.897) 0:01:43.706 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.037) 0:01:43.744 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 
1654103025.9701216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103025.9701216, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103025.9701216, "nlink": 1, "path": "/opt/test2/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3073992638", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.382) 0:01:44.126 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:255 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.038) 0:01:44.165 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.054) 0:01:44.219 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": 
"present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.040) 0:01:44.260 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:03:54 +0000 (0:00:00.031) 0:01:44.291 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "813951ce-3555-43f4-bb09-66f76faef81b" }, "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "size": "10G", "type": "crypt", "uuid": "6Ipvsa-G02u-BekP-2lrV-SJGd-UIPT-u88iEo" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:03:55 +0000 (0:00:00.385) 0:01:44.677 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002522", "end": "2022-06-01 13:03:54.973959", "rc": 0, "start": "2022-06-01 13:03:54.971437" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:03:55 +0000 (0:00:00.396) 0:01:45.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002691", "end": "2022-06-01 13:03:55.360215", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:03:55.357524" } STDOUT: luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:03:55 +0000 (0:00:00.391) 0:01:45.465 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.076) 0:01:45.542 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.034) 0:01:45.577 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.063) 0:01:45.640 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.038) 0:01:45.679 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "pv": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.368) 0:01:46.047 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.045) 0:01:46.092 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.039) 0:01:46.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.036) 0:01:46.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.032) 0:01:46.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.032) 0:01:46.232 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.047) 0:01:46.279 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.056) 0:01:46.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.031) 0:01:46.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:03:56 +0000 (0:00:00.030) 0:01:46.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.083) 0:01:46.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.033) 0:01:46.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.031) 0:01:46.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.034) 0:01:46.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.036) 0:01:46.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.033) 0:01:46.650 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.063) 0:01:46.714 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.063) 0:01:46.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.031) 0:01:46.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.032) 0:01:46.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.030) 0:01:46.872 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.063) 0:01:46.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.037) 0:01:46.974 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2 TASK [Get the backing device path] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.059) 0:01:47.033 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/disk/by-uuid/bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" ], "delta": 
"0:00:00.002749", "end": "2022-06-01 13:03:57.315210", "rc": 0, "start": "2022-06-01 13:03:57.312461" } STDOUT: /dev/sda TASK [Collect LUKS info for this member] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6 Wednesday 01 June 2022 17:03:57 +0000 (0:00:00.387) 0:01:47.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011613", "end": "2022-06-01 13:03:57.709954", "rc": 0, "start": "2022-06-01 13:03:57.698341" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 59 ec 43 cc c8 30 cb ec e8 a6 ed 1e bb a9 eb 4f fd e2 59 c8 MK salt: 76 09 3e 6e 76 fc 0d 5a dd 92 40 2c 3f db d1 15 bf d0 83 e4 3a 39 dd 2e 13 53 95 a8 7c 57 33 19 MK iterations: 96518 UUID: bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f Key Slot 0: ENABLED Iterations: 1558062 Salt: f1 08 11 51 9b 46 de d3 e3 84 6e 8c 84 9f 76 20 7e 62 3a 38 91 43 56 2e d3 70 7a 18 01 b6 7e 6f Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.392) 0:01:47.813 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.041) 0:01:47.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.031) 0:01:47.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.030) 0:01:47.916 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.064) 0:01:47.980 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda -" ] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.035) 0:01:48.016 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.035) 0:01:48.051 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.038) 0:01:48.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.029) 0:01:48.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.039) 0:01:48.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.033) 0:01:48.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.031) 0:01:48.223 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.062) 0:01:48.286 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.065) 0:01:48.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 
Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.031) 0:01:48.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.030) 0:01:48.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:03:58 +0000 (0:00:00.031) 0:01:48.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.032) 0:01:48.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.030) 0:01:48.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.030) 0:01:48.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.030) 0:01:48.570 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.032) 0:01:48.602 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.031) 0:01:48.633 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.062) 0:01:48.695 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.034) 0:01:48.730 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.158) 0:01:48.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.036) 0:01:48.925 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097148, "inode_total": 2097152, "inode_used": 4, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "813951ce-3555-43f4-bb09-66f76faef81b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097148, "inode_total": 2097152, "inode_used": 4, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": 
"813951ce-3555-43f4-bb09-66f76faef81b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.042) 0:01:48.968 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.039) 0:01:49.007 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.035) 0:01:49.042 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.038) 0:01:49.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.030) 0:01:49.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.030) 0:01:49.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.031) 0:01:49.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.033) 0:01:49.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.046) 0:01:49.254 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.035) 0:01:49.289 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.038) 0:01:49.328 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.032) 0:01:49.360 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.034) 0:01:49.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.038) 0:01:49.434 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:03:59 +0000 (0:00:00.038) 0:01:49.473 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103012.0451214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103012.0451214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13152, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654103012.0451214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.384) 0:01:49.857 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.038) 0:01:49.896 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.037) 0:01:49.933 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.033) 0:01:49.966 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.032) 0:01:49.998 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.035) 0:01:50.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.032) 0:01:50.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.032) 0:01:50.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.031) 0:01:50.130 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.039) 0:01:50.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.038) 0:01:50.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.032) 
0:01:50.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.031) 0:01:50.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.034) 0:01:50.308 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.032) 0:01:50.340 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.038) 0:01:50.379 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.038) 0:01:50.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 
Wednesday 01 June 2022 17:04:00 +0000 (0:00:00.030) 0:01:50.448 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.031) 0:01:50.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.031) 0:01:50.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.034) 0:01:50.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.032) 0:01:50.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.033) 0:01:50.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.032) 0:01:50.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.031) 0:01:50.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.032) 0:01:50.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.032) 0:01:50.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.031) 0:01:50.770 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:04:01 +0000 (0:00:00.410) 0:01:51.180 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.400) 0:01:51.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.040) 0:01:51.620 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.080) 0:01:51.701 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.031) 0:01:51.733 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.033) 0:01:51.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.030) 0:01:51.797 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.031) 0:01:51.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.031) 0:01:51.860 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.037) 0:01:51.898 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.033) 0:01:51.931 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.039) 0:01:51.971 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043667", "end": "2022-06-01 13:04:02.302891", "rc": 0, "start": "2022-06-01 13:04:02.259224" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.441) 0:01:52.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:04:02 +0000 (0:00:00.040) 0:01:52.453 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.039) 0:01:52.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.034) 0:01:52.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.034) 0:01:52.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.034) 0:01:52.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.034) 0:01:52.631 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.032) 0:01:52.663 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.031) 0:01:52.695 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.031) 0:01:52.727 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:257 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.031) 0:01:52.758 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.069) 0:01:52.828 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.047) 0:01:52.875 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.524) 0:01:53.400 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:04:03 +0000 (0:00:00.073) 0:01:53.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.032) 0:01:53.506 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.033) 0:01:53.540 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed 
on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.063) 0:01:53.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.026) 0:01:53.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.041) 0:01:53.672 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.035) 0:01:53.707 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.035) 0:01:53.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.032) 0:01:53.776 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.030) 0:01:53.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.029) 0:01:53.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.033) 0:01:53.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:04:04 +0000 (0:00:00.049) 0:01:53.920 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", 
"StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:04:05 +0000 (0:00:00.765) 0:01:54.685 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], 
"mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_mount_id": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:04:07 +0000 (0:00:01.943) 0:01:56.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:04:07 +0000 (0:00:00.031) 0:01:56.661 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d662f5f4f\x2dad4e\x2d4732\x2da294\x2d8e3657855c6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "name": 
"systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": 
"none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit 
systemd-cryptsetup@luks\\x2d662f5f4f\\x2dad4e\\x2d4732\\x2da294\\x2d8e3657855c6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d662f5f4f\\\\x2dad4e\\\\x2d4732\\\\x2da294\\\\x2d8e3657855c6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:04:07 +0000 (0:00:00.676) 0:01:57.338 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "fs_type": null }, { 
"action": "destroy format", "device": "/dev/sda", "fs_type": "luks" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_mount_id": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:04:07 +0000 (0:00:00.048) 0:01:57.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:04:07 +0000 (0:00:00.038) 0:01:57.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_mount_id": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:04:07 +0000 (0:00:00.037) 0:01:57.463 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:04:08 +0000 (0:00:00.379) 0:01:57.842 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:04:09 +0000 (0:00:00.638) 0:01:58.480 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:04:09 +0000 (0:00:00.035) 0:01:58.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:04:09 +0000 (0:00:00.663) 0:01:59.180 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103018.2841215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b52a64e05fd9717766425376aeae2c08238d8609", "ctime": 1654103016.2001214, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792390, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103016.1991215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "4082670406", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:04:10 +0000 (0:00:00.389) 0:01:59.569 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:04:10 +0000 (0:00:00.375) 0:01:59.945 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:267 Wednesday 01 June 2022 17:04:11 +0000 (0:00:00.877) 0:02:00.822 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:04:11 +0000 (0:00:00.058) 0:02:00.880 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:04:11 +0000 (0:00:00.029) 0:02:00.910 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_mount_id": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:04:11 +0000 (0:00:00.037) 0:02:00.947 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, 
"/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:04:11 +0000 (0:00:00.388) 0:02:01.335 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003024", "end": "2022-06-01 13:04:11.620036", "rc": 0, "start": "2022-06-01 13:04:11.617012" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.388) 0:02:01.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003253", "end": "2022-06-01 13:04:11.991793", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:04:11.988540" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.374) 0:02:02.097 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.029) 0:02:02.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.031) 0:02:02.158 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.117) 0:02:02.276 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.036) 0:02:02.312 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.122) 0:02:02.434 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:04:12 +0000 (0:00:00.034) 0:02:02.469 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.040) 0:02:02.510 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.036) 0:02:02.541 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.031) 0:02:02.577 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.031) 0:02:02.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.031) 0:02:02.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.032) 0:02:02.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.030) 0:02:02.703 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.029) 0:02:02.733 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.046) 0:02:02.779 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.025) 0:02:02.805 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.035) 0:02:02.840 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.028) 0:02:02.869 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.031) 0:02:02.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.029) 0:02:02.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.028) 0:02:02.958 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103046.4621215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103046.4621215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103046.4621215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.392) 0:02:03.351 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.037) 0:02:03.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.027) 0:02:03.416 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.031) 0:02:03.447 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:04:13 +0000 (0:00:00.027) 0:02:03.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.025) 0:02:03.501 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.375) 0:02:03.877 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.031) 0:02:03.908 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.038) 0:02:03.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.024) 0:02:03.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.029) 0:02:04.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.029) 0:02:04.030 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.029) 0:02:04.060 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.028) 0:02:04.088 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.031) 0:02:04.120 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.037) 0:02:04.158 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.034) 0:02:04.193 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.029) 0:02:04.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.028) 0:02:04.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.030) 0:02:04.282 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.031) 0:02:04.313 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.028) 0:02:04.341 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.028) 0:02:04.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.030) 0:02:04.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.030) 0:02:04.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:04:14 +0000 (0:00:00.031) 0:02:04.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.031) 0:02:04.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.035) 0:02:04.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.033) 0:02:04.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:04.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.029) 0:02:04.623 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.031) 0:02:04.654 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.078) 0:02:04.733 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:04.763 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.028) 0:02:04.792 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:04.822 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:04.853 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.033) 0:02:04.887 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.034) 0:02:04.921 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.029) 0:02:04.951 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.032) 0:02:04.983 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:05.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:05.044 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:05.075 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.034) 0:02:05.109 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.031) 0:02:05.140 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.030) 0:02:05.171 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.029) 0:02:05.201 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=544 changed=40 unreachable=0 failed=3 skipped=337 rescued=3 ignored=0

Wednesday 01 June 2022 17:04:15 +0000 (0:00:00.016) 0:02:05.218 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 10.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.43s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.36s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.24s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.21s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp7247_7fr/tests/tests_luks_pool.yml:2 ----------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : get required packages ---------------------- 1.03s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.02s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Mask the systemd cryptsetup services ------- 0.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
linux-system-roles.storage : Update facts ------------------------------- 0.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:04:16 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:04:17 +0000 (0:00:01.320) 0:00:01.343 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: 
/tmp/tmp7247_7fr/tests/verify-data-preservation.yml PLAYBOOK: tests_luks_pool_nvme_generated.yml *********************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_luks_pool_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:04:17 +0000 (0:00:00.044) 0:00:01.388 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. 
Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:04:18 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:04:19 +0000 (0:00:01.249) 0:00:01.271 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 
----------------------------------------------------- statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml PLAYBOOK: tests_luks_pool_scsi_generated.yml *********************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_luks_pool_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool_scsi_generated.yml:3 Wednesday 01 June 2022 17:04:19 +0000 (0:00:00.039) 0:00:01.311 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool_scsi_generated.yml:7 Wednesday 01 June 2022 17:04:20 +0000 (0:00:01.107) 0:00:02.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:2 Wednesday 01 June 2022 17:04:21 +0000 (0:00:00.027) 0:00:02.446 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:14 Wednesday 01 June 2022 17:04:21 +0000 (0:00:00.845) 0:00:03.291 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:04:21 +0000 (0:00:00.037)       0:00:03.328 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.156)       0:00:03.485 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.528)       0:00:04.013 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.079)       0:00:04.093 ********
ok:
[/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.023)       0:00:04.117 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.022)       0:00:04.139 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.203)       0:00:04.342 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:04:22 +0000 (0:00:00.019)       0:00:04.362 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:04:24 +0000 (0:00:01.085)       0:00:05.447 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.046)       0:00:05.494 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.044)       0:00:05.539 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.663)       0:00:06.202 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.079)       0:00:06.281 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.020)       0:00:06.302 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  17:04:24 +0000 (0:00:00.021)       0:00:06.323 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:04:24 +0000 (0:00:00.020) 0:00:06.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:04:25 +0000 (0:00:00.817) 0:00:07.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service": { "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:04:27 +0000 (0:00:01.861) 0:00:09.022 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:04:27 +0000 (0:00:00.041) 0:00:09.064 
******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket cryptsetup-pre.target dev-sda.device systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.target\" umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": 
"success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-bd5f7f20-d665-4abc-9bd1-dd0c981eaf5f ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", 
"IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": 
"infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:03:49 EDT", "StateChangeTimestampMonotonic": "2618959324", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:04:28 +0000 (0:00:00.993) 0:00:10.057 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.511) 0:00:10.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] 
***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.027) 0:00:10.596 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", 
"CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": 
"success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.652) 0:00:11.249 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, 
"crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.034) 0:00:11.283 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.031) 0:00:11.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.033) 0:00:11.349 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.028) 0:00:11.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:04:29 +0000 (0:00:00.028) 0:00:11.406 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:04:30 +0000 (0:00:00.030) 0:00:11.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:04:30 +0000 (0:00:00.028) 0:00:11.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103051.9901216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103049.8361216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103049.8351214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2808007752", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:04:30 +0000 (0:00:00.497) 0:00:11.962 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:04:30 +0000 (0:00:00.027) 0:00:11.989 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:17 Wednesday 01 June 2022 17:04:31 +0000 (0:00:00.858) 0:00:12.847 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "service_facts" ] }, 
"changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:23 Wednesday 01 June 2022 17:04:31 +0000 (0:00:00.031) 0:00:12.878 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:04:31 +0000 (0:00:00.043) 0:00:12.922 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.524) 0:00:13.447 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.036) 0:00:13.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.029) 0:00:13.512 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create an encrypted lvm pool] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:34 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.032) 0:00:13.545 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.047) 0:00:13.592 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.042) 0:00:13.634 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.573) 0:00:14.207 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.071) 0:00:14.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.035) 0:00:14.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.031) 0:00:14.346 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:04:32 +0000 (0:00:00.062) 0:00:14.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:04:33 +0000 (0:00:00.024) 0:00:14.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:04:33 +0000 (0:00:00.028) 0:00:14.462 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:04:33 +0000 
(0:00:00.035) 0:00:14.497 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:04:33 +0000 (0:00:00.031) 0:00:14.528 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:04:34 +0000 (0:00:01.054) 0:00:15.583 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:04:34 +0000 (0:00:00.052) 0:00:15.635 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:04:34 +0000 (0:00:00.027) 0:00:15.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:04:34 +0000 (0:00:00.036) 0:00:15.700 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:04:34 +0000 
(0:00:00.031) 0:00:15.731 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:04:35 +0000 (0:00:00.871) 0:00:16.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:04:35 +0000 (0:00:00.031) 0:00:16.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:04:35 +0000 (0:00:00.044) 0:00:16.678 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": 
"no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no 
data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", 
"MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": 
"no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:04:35 +0000 (0:00:00.683) 0:00:17.361 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted pool 'foo' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:04:36 +0000 (0:00:01.051) 0:00:18.412 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': True, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, 
u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted pool 'foo' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:04:37 +0000 (0:00:00.038) 0:00:18.451 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:53 Wednesday 01 June 2022 17:04:37 +0000 (0:00:00.719) 0:00:19.170 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:59 Wednesday 01 June 2022 17:04:37 +0000 (0:00:00.035) 0:00:19.206 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:66 Wednesday 01 June 2022 17:04:37 +0000 (0:00:00.034) 0:00:19.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Create a key file] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:75 Wednesday 01 June 2022 17:04:37 +0000 (0:00:00.030) 0:00:19.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testk5i0yh4zlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:82 
Wednesday 01 June 2022 17:04:38 +0000 (0:00:00.504) 0:00:19.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testk5i0yh4zlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1654103078.4-83105-157596499847809/source", "state": "file", "uid": 0 } TASK [Create an encrypted lvm pool using a key file] *************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:89 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.835) 0:00:20.611 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.049) 0:00:20.661 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.079) 0:00:20.740 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.517) 0:00:21.258 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { 
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.069) 0:00:21.328 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.031) 0:00:21.359 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:04:39 +0000 (0:00:00.028) 0:00:21.388 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.058) 0:00:21.447 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.024) 0:00:21.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.031) 0:00:21.503 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_key": "/tmp/storage_testk5i0yh4zlukskey", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.036) 0:00:21.539 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.031) 0:00:21.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.029) 0:00:21.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.028) 0:00:21.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.028) 0:00:21.658 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.030) 0:00:21.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:04:40 +0000 (0:00:00.043) 0:00:21.732 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": 
"infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", 
"IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", 
"MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:04:41 +0000 (0:00:00.694) 0:00:22.426 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "/tmp/storage_testk5i0yh4zlukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": 
[ "xfsprogs", "lvm2", "cryptsetup", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testk5i0yh4zlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:04:51 +0000 (0:00:10.881) 0:00:33.308 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:04:51 +0000 
(0:00:00.030) 0:00:33.339 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:04:52 +0000 (0:00:00.736) 0:00:34.076 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "/tmp/storage_testk5i0yh4zlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testk5i0yh4zlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:04:52 +0000 (0:00:00.043) 0:00:34.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testk5i0yh4zlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", 
"state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:04:52 +0000 (0:00:00.036) 0:00:34.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:04:52 +0000 (0:00:00.032) 0:00:34.188 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:04:52 +0000 (0:00:00.028) 0:00:34.217 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:04:53 +0000 (0:00:00.641) 0:00:34.859 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 
June 2022 17:04:53 +0000 (0:00:00.558) 0:00:35.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:04:54 +0000 (0:00:00.674) 0:00:36.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103051.9901216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103049.8361216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103049.8351214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2808007752", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:04:55 +0000 (0:00:00.390) 0:00:36.482 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'/tmp/storage_testk5i0yh4zlukskey', u'name': u'luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "/tmp/storage_testk5i0yh4zlukskey", "state": "present" } } MSG: line added TASK [linux-system-roles.storage 
: Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:04:55 +0000 (0:00:00.513) 0:00:36.996 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:104 Wednesday 01 June 2022 17:04:56 +0000 (0:00:00.894) 0:00:37.891 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:04:56 +0000 (0:00:00.053) 0:00:37.944 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testk5i0yh4zlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:04:56 +0000 (0:00:00.041) 0:00:37.985 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:04:56 +0000 (0:00:00.030) 0:00:38.015 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1640faea-fb6a-4c45-8135-828f25afecd8" }, "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "size": "10G", "type": "crypt", "uuid": "3RahV0-qDxw-mHuJ-FV0K-EOFa-AA6C-OLu3IF" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "84d2afbd-152e-4102-b8d4-0ced7cd94c16" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", 
"name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:04:57 +0000 (0:00:00.565) 0:00:38.580 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002738", "end": "2022-06-01 13:04:57.029738", "rc": 0, "start": "2022-06-01 13:04:57.027000" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:04:57 +0000 (0:00:00.500) 0:00:39.081 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002674", "end": "2022-06-01 13:04:57.414907", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:04:57.412233" } STDOUT: luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16 /dev/sda /tmp/storage_testk5i0yh4zlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.385) 0:00:39.466 
******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.064) 0:00:39.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.032) 0:00:39.563 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.065) 0:00:39.628 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.039) 0:00:39.668 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "pv": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.503) 0:00:40.171 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.079) 0:00:40.251 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.039) 0:00:40.291 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.035) 0:00:40.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.031) 0:00:40.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:04:58 +0000 (0:00:00.030) 0:00:40.388 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.044) 0:00:40.433 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.057) 0:00:40.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.031) 0:00:40.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.032) 0:00:40.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.032) 0:00:40.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.030) 0:00:40.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.030) 0:00:40.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.029) 0:00:40.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.030) 0:00:40.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.031) 0:00:40.740 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.061) 0:00:40.801 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.064) 0:00:40.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.034) 0:00:40.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.032) 0:00:40.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.029) 0:00:40.963 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.063) 0:00:41.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testk5i0yh4zlukskey" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.036) 0:00:41.062 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2 TASK [Get the backing device path] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1 Wednesday 01 June 2022 17:04:59 +0000 (0:00:00.060) 0:00:41.123 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", 
"/dev/disk/by-uuid/84d2afbd-152e-4102-b8d4-0ced7cd94c16" ], "delta": "0:00:00.003541", "end": "2022-06-01 13:04:59.446740", "rc": 0, "start": "2022-06-01 13:04:59.443199" } STDOUT: /dev/sda TASK [Collect LUKS info for this member] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6 Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.381) 0:00:41.504 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011995", "end": "2022-06-01 13:04:59.831453", "rc": 0, "start": "2022-06-01 13:04:59.819458" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 84d2afbd-152e-4102-b8d4-0ced7cd94c16 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 10 Memory: 906462 Threads: 4 Salt: f8 8b 69 1e d5 5b 72 b2 2e 15 3c a0 7b 44 aa 95 04 f3 9d 46 fd e4 40 03 5a e8 97 3e 8c 1a 8d 35 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 95953 Salt: ef 96 29 7a 2e e6 4e fe e6 fe b2 a4 9c ab 37 0d ef fd 6e 59 c5 b1 21 4d f4 b2 e4 77 91 0b 65 c6 Digest: cd 6e b9 04 cb 44 71 f6 0d bf 89 3b 8f d4 89 5b 72 eb 2b 3e 83 ea da b6 7d 8e 65 16 bd ef 27 64 TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12 Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.380) 0:00:41.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.032) 0:00:41.917 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.031) 0:00:41.948 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.031) 0:00:41.979 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.064) 0:00:42.044 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": ["luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16 /dev/sda /tmp/storage_testk5i0yh4zlukskey"]}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.040) 0:00:42.085 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.037) 0:00:42.122 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.030) 0:00:42.162 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.039) 0:00:42.233 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.029) 0:00:42.262 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.027) 0:00:42.290 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.063) 0:00:42.353 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.069) 0:00:42.423 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:05:00 +0000 (0:00:00.069) 0:00:42.423 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.031) 0:00:42.455 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.030) 0:00:42.485 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.028) 0:00:42.514 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.074) 0:00:42.588 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.032) 0:00:42.621 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.032) 0:00:42.653 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.029) 0:00:42.683 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.030) 0:00:42.713 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.030) 0:00:42.744 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.061) 0:00:42.806 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.034) 0:00:42.841 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included:
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.140) 0:00:42.982 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.037) 0:00:43.019 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "1640faea-fb6a-4c45-8135-828f25afecd8"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "1640faea-fb6a-4c45-8135-828f25afecd8"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.047) 0:00:43.067 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.040) 0:00:43.108 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.037) 0:00:43.146 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.038) 0:00:43.184 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.030) 0:00:43.215 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.032) 0:00:43.247 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason":
"Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.030) 0:00:43.277 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.031) 0:00:43.309 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/foo-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.046) 0:00:43.356 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:05:01 +0000 (0:00:00.035) 0:00:43.392 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.037) 0:00:43.429 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.032) 0:00:43.461 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.031) 0:00:43.493 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.037) 0:00:43.530 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.037) 0:00:43.567 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103091.1901214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103091.1901214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13409, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid":
false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103091.1901214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.391) 0:00:43.959 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.045) 0:00:44.004 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.037) 0:00:44.042 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.037) 0:00:44.080 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.032) 0:00:44.112 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.036) 0:00:44.149 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.031) 0:00:44.180 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.030) 0:00:44.210 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.030) 0:00:44.240 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.040) 0:00:44.281 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.036) 0:00:44.317 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.031) 0:00:44.349 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.030) 0:00:44.380 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:05:02 +0000 (0:00:00.030) 0:00:44.411 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.029) 0:00:44.441 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.039) 0:00:44.481 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.036) 0:00:44.517 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.030) 0:00:44.548 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.030) 0:00:44.578 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.030) 0:00:44.608 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.031) 0:00:44.640 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.033) 0:00:44.673 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.031) 0:00:44.705 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was
False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.030) 0:00:44.735 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.031) 0:00:44.767 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.034) 0:00:44.802 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.036) 0:00:44.838 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.085) 0:00:44.924 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:05:03 +0000 (0:00:00.480) 0:00:45.405 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.379) 0:00:45.784 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.037) 0:00:45.821 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.033) 0:00:45.855 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.030) 0:00:45.886 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.030) 0:00:45.916 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.031) 0:00:45.948 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.033) 0:00:45.981 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result
was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.031) 0:00:46.012 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.034) 0:00:46.046 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.032) 0:00:46.079 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:05:04 +0000 (0:00:00.038) 0:00:46.117 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.034962", "end": "2022-06-01 13:05:04.471690", "rc": 0, "start": "2022-06-01 13:05:04.436728"}

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.406) 0:00:46.524 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.040) 0:00:46.564 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.041) 0:00:46.605 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.033) 0:00:46.638 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.033) 0:00:46.672 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.033) 0:00:46.705 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.030) 0:00:46.736 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.031) 0:00:46.767 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.033) 0:00:46.800 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.027) 0:00:46.828 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}

TASK [Remove the key file] *****************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:106
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.030) 0:00:46.858 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "path": "/tmp/storage_testk5i0yh4zlukskey", "state": "absent"}

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 17:05:05 +0000 (0:00:00.524) 0:00:47.383 ********
changed: [/cache/rhel-x.qcow2] => {"changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0}

TASK [Remove the encryption layer] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:116
Wednesday 01 June 2022 17:05:06 +0000 (0:00:00.389) 0:00:47.773 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:05:06 +0000 (0:00:00.050) 0:00:47.823 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:05:06 +0000 (0:00:00.045) 0:00:47.868 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:05:06 +0000 (0:00:00.527) 0:00:48.395 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.075) 0:00:48.470 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.032) 0:00:48.503 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.029) 0:00:48.532 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.062) 0:00:48.595 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.028) 0:00:48.623 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.029) 0:00:48.653 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": [{"disks": ["sda"], "encryption": false, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [{"mount_point": "/opt/test1", "name": "test1", "size": "4g"}]}]}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.037) 0:00:48.691 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.033) 0:00:48.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.030) 0:00:48.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.030) 0:00:48.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.070) 0:00:48.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.031) 0:00:48.887 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ 
"systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:05:07 +0000 (0:00:00.044) 0:00:48.932 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf 
cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": 
"819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:05:08 +0000 (0:00:00.719) 0:00:49.651 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove and recreate existing pool 'foo' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:05:09 +0000 (0:00:01.306) 0:00:50.958 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': 0, u'encryption_key': None, u'encryption_luks_version': u'luks2', u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': 
None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove and recreate existing pool 'foo' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:05:09 +0000 (0:00:00.049) 0:00:51.008 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": 
"auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:134 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.709) 0:00:51.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:140 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.038) 0:00:51.757 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.037) 0:00:51.794 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103105.7161214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103105.7161214, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, 
"isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103105.7161214, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3942106770", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.395) 0:00:52.189 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:151 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.038) 0:00:52.228 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.052) 0:00:52.281 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:05:10 +0000 (0:00:00.045) 0:00:52.327 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.519) 0:00:52.846 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.073) 0:00:52.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.032) 0:00:52.952 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.031) 0:00:52.984 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.065) 0:00:53.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.026) 0:00:53.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.031) 0:00:53.107 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.038) 0:00:53.145 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.035) 0:00:53.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 
17:05:11 +0000 (0:00:00.030) 0:00:53.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.030) 0:00:53.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.029) 0:00:53.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.030) 0:00:53.301 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:05:11 +0000 (0:00:00.045) 0:00:53.347 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": 
"0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": 
"[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:05:12 +0000 (0:00:00.763) 0:00:54.110 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", 
"device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": 
null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:05:15 +0000 (0:00:02.536) 0:00:56.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:05:15 +0000 (0:00:00.033) 0:00:56.681 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct 
cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": 
"no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", 
"UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:05:15 +0000 (0:00:00.733) 0:00:57.414 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:05:16 +0000 (0:00:00.045) 0:00:57.460 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": 
"lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:05:16 +0000 (0:00:00.038) 0:00:57.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:05:16 +0000 (0:00:00.035) 0:00:57.534 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", 
"src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:05:16 +0000 (0:00:00.375) 0:00:57.909 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:05:17 +0000 (0:00:00.690) 0:00:58.600 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:05:17 +0000 (0:00:00.430) 0:00:59.030 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:05:18 +0000 (0:00:00.679) 0:00:59.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103097.4141216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a517c8760201fbad1424d7149b41b552671402ac", "ctime": 1654103094.9271214, 
"dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417712, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103094.9261215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 84, "uid": 0, "version": "2787264733", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:05:18 +0000 (0:00:00.396) 0:01:00.106 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-84d2afbd-152e-4102-b8d4-0ced7cd94c16", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:05:19 +0000 (0:00:00.408) 0:01:00.514 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:167 Wednesday 01 June 2022 17:05:19 +0000 (0:00:00.884) 0:01:01.398 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 
2022 17:05:20 +0000 (0:00:00.059) 0:01:01.458 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:05:20 +0000 (0:00:00.043) 0:01:01.501 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:05:20 +0000 (0:00:00.031) 0:01:01.532 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "c11ea316-c90e-4f5e-8a72-a2ea07b2d535" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "dE0XEz-NMTF-pSbZ-sAA0-peDB-mDUB-R23NFG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:05:20 +0000 (0:00:00.386) 0:01:01.919 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002664", "end": "2022-06-01 13:05:20.242229", "rc": 0, "start": "2022-06-01 13:05:20.239565" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:05:20 +0000 (0:00:00.375) 0:01:02.294 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002882", "end": "2022-06-01 13:05:20.615917", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:05:20.613035" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.377) 0:01:02.672 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.116) 0:01:02.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.032) 0:01:02.821 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.060) 0:01:02.882 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.038) 0:01:02.921 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.412) 0:01:03.333 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.044) 0:01:03.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:05:21 +0000 (0:00:00.038) 0:01:03.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.037) 0:01:03.452 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.036) 0:01:03.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.028) 0:01:03.518 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.041) 0:01:03.559 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.064) 0:01:03.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.031) 0:01:03.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.031) 0:01:03.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.031) 0:01:03.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.034) 0:01:03.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.031) 0:01:03.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:05:22 +0000 (0:00:00.034) 0:01:03.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.032) 0:01:03.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.033) 0:01:03.885 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.064) 0:01:03.950 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.072) 0:01:04.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.033) 0:01:04.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:05:22 +0000 (0:00:00.033) 0:01:04.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.034) 0:01:04.123 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.065) 0:01:04.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.037) 0:01:04.226 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.037) 0:01:04.264 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.068) 0:01:04.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.036) 0:01:04.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:05:22 +0000 (0:00:00.036) 0:01:04.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.030) 0:01:04.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.030) 0:01:04.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.034) 0:01:04.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.032) 0:01:04.533 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.032) 0:01:04.565 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.064) 0:01:04.630 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.071) 0:01:04.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.031) 0:01:04.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.031) 0:01:04.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.030) 0:01:04.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.031) 0:01:04.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.030) 0:01:04.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.029) 0:01:04.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.029) 0:01:04.917 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.030) 0:01:04.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.031) 0:01:04.978 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.117) 0:01:05.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.044) 0:01:05.141 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.125) 0:01:05.266 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.035) 0:01:05.302 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "c11ea316-c90e-4f5e-8a72-a2ea07b2d535" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "c11ea316-c90e-4f5e-8a72-a2ea07b2d535" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.042) 0:01:05.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:05:23 +0000 (0:00:00.041) 0:01:05.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:05:23 +0000 
(0:00:00.035) 0:01:05.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.039) 0:01:05.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.031) 0:01:05.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.029) 0:01:05.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.029) 0:01:05.551 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.033) 0:01:05.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.046) 0:01:05.631 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.034) 0:01:05.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.036) 0:01:05.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.029) 0:01:05.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.030) 0:01:05.763 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.039) 0:01:05.803 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.036) 0:01:05.840 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103114.5171216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103114.5171216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13547, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103114.5171216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.389) 0:01:06.230 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.037) 0:01:06.267 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.035) 0:01:06.302 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.032) 0:01:06.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.030) 0:01:06.365 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:05:24 +0000 (0:00:00.037) 0:01:06.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.029) 0:01:06.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:06.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.028) 0:01:06.491 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.036) 0:01:06.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.029) 0:01:06.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.033) 0:01:06.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:06.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.031) 0:01:06.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.031) 0:01:06.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.037) 0:01:06.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.035) 0:01:06.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.032) 0:01:06.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:06.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:06.851 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:06.881 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.029) 0:01:06.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.029) 0:01:06.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.031) 0:01:06.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:07.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.031) 0:01:07.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.030) 0:01:07.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:05:25 +0000 (0:00:00.031) 0:01:07.097 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.367) 0:01:07.464 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.378) 0:01:07.842 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.037) 0:01:07.880 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.034) 0:01:07.915 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.031) 0:01:07.947 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.029) 0:01:07.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.031) 0:01:08.008 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.030) 0:01:08.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.034) 0:01:08.072 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.033) 0:01:08.106 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.037) 0:01:08.143 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:05:26 +0000 (0:00:00.043) 0:01:08.187 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036487", "end": "2022-06-01 13:05:26.554813", "rc": 0, "start": "2022-06-01 13:05:26.518326" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.427) 0:01:08.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.041) 0:01:08.656 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.041) 0:01:08.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.032) 0:01:08.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.034) 0:01:08.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.037) 0:01:08.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.032) 0:01:08.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.031) 0:01:08.865 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.031) 0:01:08.897 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.027) 0:01:08.925 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.030) 0:01:08.955 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the pool] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:173 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.409) 0:01:09.365 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:05:27 +0000 (0:00:00.050) 0:01:09.416 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.045) 0:01:09.461 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.510) 0:01:09.972 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", 
"kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.073) 0:01:10.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.032) 0:01:10.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.031) 0:01:10.109 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.063) 0:01:10.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 
17:05:28 +0000 (0:00:00.026) 0:01:10.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.031) 0:01:10.230 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.036) 0:01:10.267 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.035) 0:01:10.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.033) 0:01:10.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 
2022 17:05:28 +0000 (0:00:00.032) 0:01:10.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:05:28 +0000 (0:00:00.032) 0:01:10.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:05:29 +0000 (0:00:00.030) 0:01:10.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:05:29 +0000 (0:00:00.046) 0:01:10.477 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", 
"MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:05:29 +0000 (0:00:00.699) 0:01:11.177 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove and recreate existing pool 'foo' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:05:31 +0000 (0:00:01.249) 0:01:12.426 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': 512, u'encryption_key': None, u'encryption_luks_version': u'luks1', u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'4g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': u'serpent-xts-plain64'}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, 
u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove and recreate existing pool 'foo' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:05:31 +0000 (0:00:00.051) 0:01:12.478 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid 
cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:194 Wednesday 01 June 2022 17:05:31 +0000 (0:00:00.718) 0:01:13.196 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:200 Wednesday 01 June 2022 17:05:31 +0000 (0:00:00.037) 0:01:13.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:05:31 +0000 (0:00:00.037) 0:01:13.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103127.2981215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103127.2981215, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103127.2981215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": 
"1319292977", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.392) 0:01:13.663 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the pool] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:211 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.038) 0:01:13.702 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.058) 0:01:13.760 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.045) 0:01:13.806 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.534) 0:01:14.341 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", 
"libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:05:32 +0000 (0:00:00.071) 0:01:14.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.032) 0:01:14.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.037) 0:01:14.482 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.067) 0:01:14.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.026) 0:01:14.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.031) 0:01:14.607 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.037) 0:01:14.645 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.034) 0:01:14.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.032) 0:01:14.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.030) 0:01:14.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.029) 0:01:14.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.030) 0:01:14.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:05:33 +0000 (0:00:00.045) 0:01:14.848 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": 
"infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", 
"IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", 
"MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:05:34 +0000 (0:00:00.669) 0:01:15.518 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", 
"/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 
June 2022 17:05:42 +0000 (0:00:08.498) 0:01:24.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:05:42 +0000 (0:00:00.031) 0:01:24.048 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm 
cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:05:43 +0000 (0:00:00.726) 0:01:24.774 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:05:43 +0000 (0:00:00.043) 0:01:24.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", 
"_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:05:43 +0000 (0:00:00.036) 0:01:24.854 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:05:43 +0000 (0:00:00.036) 0:01:24.890 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:05:43 +0000 (0:00:00.396) 0:01:25.287 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:05:44 +0000 (0:00:00.667) 0:01:25.955 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:05:44 +0000 (0:00:00.423) 0:01:26.378 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:05:45 +0000 (0:00:00.684) 0:01:27.062 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103120.6151216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103118.4521215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792394, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103118.4501214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1062648471", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:05:46 +0000 (0:00:00.383) 0:01:27.446 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:05:46 +0000 (0:00:00.380) 0:01:27.827 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:230 Wednesday 01 June 2022 17:05:47 +0000 (0:00:00.872) 0:01:28.699 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:05:47 +0000 (0:00:00.052) 0:01:28.752 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": 
true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:05:47 +0000 (0:00:00.041) 0:01:28.793 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:05:47 +0000 (0:00:00.030) 0:01:28.823 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "19cb1b46-b01b-479d-97dc-0615a909a41f" }, "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "size": "10G", "type": "crypt", "uuid": "0yCOuq-oNxN-iaE4-tY81-nYZt-FovW-yY1AE3" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e06afd73-5775-4a8f-9f6b-d549bfae82c7" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:05:47 +0000 (0:00:00.378) 0:01:29.201 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002469", "end": "2022-06-01 13:05:47.526378", "rc": 0, "start": "2022-06-01 13:05:47.523909" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.377) 0:01:29.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002882", "end": "2022-06-01 13:05:47.902564", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:05:47.899682" } STDOUT: luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.377) 0:01:29.956 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.065) 0:01:30.022 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.030) 0:01:30.052 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.104) 0:01:30.157 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:05:48 +0000 (0:00:00.040) 0:01:30.197 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "pv": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.389) 0:01:30.587 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.042) 0:01:30.629 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.038) 0:01:30.668 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.036) 0:01:30.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.030) 0:01:30.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.029) 0:01:30.764 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.043) 0:01:30.807 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.058) 0:01:30.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.034) 0:01:30.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.036) 0:01:30.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.032) 0:01:30.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.031) 0:01:31.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.031) 0:01:31.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.033) 0:01:31.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.030) 0:01:31.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.031) 0:01:31.128 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.058) 0:01:31.186 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.062) 0:01:31.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.030) 0:01:31.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.031) 0:01:31.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.031) 0:01:31.342 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:05:49 +0000 (0:00:00.064) 0:01:31.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.039) 0:01:31.446 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2 TASK [Get the backing device path] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.060) 0:01:31.507 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/disk/by-uuid/e06afd73-5775-4a8f-9f6b-d549bfae82c7" ], "delta": 
"0:00:00.002490", "end": "2022-06-01 13:05:49.825482", "rc": 0, "start": "2022-06-01 13:05:49.822992" } STDOUT: /dev/sda TASK [Collect LUKS info for this member] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.372) 0:01:31.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012642", "end": "2022-06-01 13:05:50.223811", "rc": 0, "start": "2022-06-01 13:05:50.211169" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: a2 ac b5 c1 c3 e2 e8 b3 db c2 6d 27 eb bf fc 31 51 b7 68 82 MK salt: 7f 82 ac 07 81 78 ad 84 f6 cc 05 a5 2c 38 21 bc cd 4e cb da a4 ab fc 92 90 64 cd 2b b7 d4 b1 5d MK iterations: 97090 UUID: e06afd73-5775-4a8f-9f6b-d549bfae82c7 Key Slot 0: ENABLED Iterations: 1553444 Salt: 4c f9 f7 2e 80 bf 65 2b f0 7b c8 de b9 98 db f9 14 81 b3 81 2a a1 c5 96 0b 26 f7 ec a0 a8 21 f9 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.403) 0:01:32.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.039) 0:01:32.322 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.040) 0:01:32.362 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:05:50 +0000 (0:00:00.038) 0:01:32.401 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.063) 0:01:32.465 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda -" ] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.036) 0:01:32.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.036) 0:01:32.538 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.041) 0:01:32.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 
June 2022 17:05:51 +0000 (0:00:00.030) 0:01:32.610 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.037) 0:01:32.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:32.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:32.711 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.107) 0:01:32.819 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.065) 0:01:32.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.032) 0:01:32.917 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.033) 0:01:32.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:32.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.030) 0:01:33.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:33.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:33.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.030) 0:01:33.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.035) 0:01:33.141 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.031) 0:01:33.173 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.061) 0:01:33.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.035) 0:01:33.270 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:05:51 +0000 (0:00:00.125) 0:01:33.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.038) 0:01:33.434 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "19cb1b46-b01b-479d-97dc-0615a909a41f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "19cb1b46-b01b-479d-97dc-0615a909a41f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK 
[Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.043) 0:01:33.477 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.041) 0:01:33.519 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.036) 0:01:33.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.037) 0:01:33.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.030) 0:01:33.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.034) 0:01:33.658 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.034) 0:01:33.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.029) 0:01:33.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.044) 0:01:33.767 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.032) 0:01:33.800 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.038) 0:01:33.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.030) 0:01:33.868 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.031) 0:01:33.899 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.038) 0:01:33.937 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.045) 0:01:33.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103141.8761215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103141.8761215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13703, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103141.8761215, "nlink": 1, "path": "/dev/mapper/foo-test1", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:05:52 +0000 (0:00:00.417) 0:01:34.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.038) 0:01:34.439 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.038) 0:01:34.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.035) 0:01:34.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.031) 0:01:34.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.036) 0:01:34.582 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.028) 0:01:34.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.028) 0:01:34.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.032) 0:01:34.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.041) 0:01:34.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.033) 0:01:34.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.029) 0:01:34.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.031) 0:01:34.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.030) 0:01:34.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.030) 0:01:34.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.037) 0:01:34.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.034) 0:01:34.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.029) 0:01:34.971 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.031) 0:01:35.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.030) 0:01:35.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.098) 0:01:35.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.033) 0:01:35.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.031) 0:01:35.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:05:53 +0000 
(0:00:00.031) 0:01:35.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.030) 0:01:35.259 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.030) 0:01:35.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.033) 0:01:35.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:05:53 +0000 (0:00:00.031) 0:01:35.354 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.379) 0:01:35.734 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.377) 0:01:36.111 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.040) 0:01:36.151 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.036) 0:01:36.188 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.033) 0:01:36.221 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.032) 0:01:36.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.032) 0:01:36.285 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.034) 0:01:36.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:05:54 
+0000 (0:00:00.032) 0:01:36.353 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.035) 0:01:36.388 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:05:54 +0000 (0:00:00.034) 0:01:36.423 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.041) 0:01:36.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037977", "end": "2022-06-01 13:05:54.826464", "rc": 0, "start": "2022-06-01 13:05:54.788487" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.417) 0:01:36.881 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.040) 
0:01:36.921 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.040) 0:01:36.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.033) 0:01:36.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.034) 0:01:37.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.034) 0:01:37.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.034) 0:01:37.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.031) 0:01:37.129 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": 
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.036) 0:01:37.166 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.029) 0:01:37.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:05:55 +0000 (0:00:00.030) 0:01:37.226 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Change the mountpoint, leaving encryption in place] ********************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:234 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.386) 0:01:37.613 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.064) 0:01:37.678 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.047) 0:01:37.725 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.538) 0:01:38.264 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.073) 0:01:38.337 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:05:56 +0000 (0:00:00.032) 0:01:38.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 
01 June 2022 17:05:57 +0000 (0:00:00.074) 0:01:38.444 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.064) 0:01:38.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.027) 0:01:38.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.033) 0:01:38.569 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "test1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.037) 0:01:38.607 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.033) 0:01:38.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.030) 0:01:38.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.032) 0:01:38.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.031) 0:01:38.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.032) 0:01:38.767 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:05:57 +0000 (0:00:00.045) 0:01:38.812 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit 
systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:05:58 +0000 (0:00:00.694) 0:01:39.507 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:05:59 +0000 (0:00:01.365) 0:01:40.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:05:59 +0000 (0:00:00.033) 0:01:40.906 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:06:00 +0000 (0:00:00.705) 0:01:41.611 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", 
"/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "path": "/opt/test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:06:00 +0000 (0:00:00.041) 0:01:41.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:06:00 +0000 (0:00:00.039) 0:01:41.691 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:06:00 +0000 (0:00:00.038) 0:01:41.730 ******** changed: 
[/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test1', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:06:00 +0000 (0:00:00.416) 0:01:42.146 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:06:01 +0000 (0:00:00.657) 0:01:42.804 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:06:01 +0000 (0:00:00.406) 0:01:43.210 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:06:02 +0000 (0:00:00.637) 0:01:43.847 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103147.9021215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4896da07d32fbe600f739aa566c21e84a8e9e12f", "ctime": 1654103145.7631216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103145.7621214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1372347311", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:06:02 +0000 (0:00:00.450) 0:01:44.298 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:06:02 +0000 (0:00:00.034) 0:01:44.333 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert to implicitly preserve encryption on existing pool] *************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:246 Wednesday 01 June 2022 17:06:03 +0000 (0:00:00.856) 0:01:45.189 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:06:03 +0000 (0:00:00.036) 0:01:45.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 
1654103155.5561216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103155.5561216, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103155.5561216, "nlink": 1, "path": "/opt/test2/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "653236371", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.383) 0:01:45.609 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:255 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.036) 0:01:45.645 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.053) 0:01:45.699 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": 
"present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.037) 0:01:45.737 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.030) 0:01:45.767 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "19cb1b46-b01b-479d-97dc-0615a909a41f" }, "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7": { "fstype": "LVM2_member", "label": "", "name": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "size": "10G", "type": "crypt", "uuid": "0yCOuq-oNxN-iaE4-tY81-nYZt-FovW-yY1AE3" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e06afd73-5775-4a8f-9f6b-d549bfae82c7" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:06:04 +0000 (0:00:00.375) 0:01:46.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002474", "end": "2022-06-01 13:06:04.453420", "rc": 0, "start": "2022-06-01 13:06:04.450946" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.368) 0:01:46.511 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002414", "end": "2022-06-01 13:06:04.846556", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:06:04.844142" } STDOUT: luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.387) 0:01:46.899 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.065) 0:01:46.964 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.030) 0:01:46.995 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.063) 0:01:47.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:06:05 +0000 (0:00:00.041) 0:01:47.100 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "pv": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.367) 0:01:47.467 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.043) 0:01:47.511 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.039) 0:01:47.550 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "crypt" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.035) 0:01:47.586 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.030) 0:01:47.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.030) 0:01:47.647 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7) => { "ansible_loop_var": "pv", "changed": false, "pv": 
"/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.044) 0:01:47.691 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.112) 0:01:47.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.032) 0:01:47.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.030) 0:01:47.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.031) 0:01:47.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.032) 0:01:47.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.031) 0:01:47.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.032) 0:01:47.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.032) 0:01:48.028 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.036) 0:01:48.065 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.062) 0:01:48.127 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.062) 0:01:48.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.032) 0:01:48.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.033) 0:01:48.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.031) 0:01:48.287 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.062) 0:01:48.349 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:06:06 +0000 (0:00:00.035) 0:01:48.385 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml for /cache/rhel-x.qcow2 TASK [Get the backing device path] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:1 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.060) 0:01:48.445 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/disk/by-uuid/e06afd73-5775-4a8f-9f6b-d549bfae82c7" ], "delta": 
"0:00:00.003438", "end": "2022-06-01 13:06:06.780871", "rc": 0, "start": "2022-06-01 13:06:06.777433" } STDOUT: /dev/sda TASK [Collect LUKS info for this member] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:6 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.391) 0:01:48.837 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011111", "end": "2022-06-01 13:06:07.174337", "rc": 0, "start": "2022-06-01 13:06:07.163226" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: a2 ac b5 c1 c3 e2 e8 b3 db c2 6d 27 eb bf fc 31 51 b7 68 82 MK salt: 7f 82 ac 07 81 78 ad 84 f6 cc 05 a5 2c 38 21 bc cd 4e cb da a4 ab fc 92 90 64 cd 2b b7 d4 b1 5d MK iterations: 97090 UUID: e06afd73-5775-4a8f-9f6b-d549bfae82c7 Key Slot 0: ENABLED Iterations: 1553444 Salt: 4c f9 f7 2e 80 bf 65 2b f0 7b c8 de b9 98 db f9 14 81 b3 81 2a a1 c5 96 0b 26 f7 ec a0 a8 21 f9 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:12 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.394) 0:01:49.232 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:18 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.041) 0:01:49.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-pool-member-encryption.yml:24 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.039) 0:01:49.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.033) 0:01:49.346 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:06:07 +0000 (0:00:00.066) 0:01:49.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda -" ] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.036) 0:01:49.448 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.037) 0:01:49.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.038) 0:01:49.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:49.554 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.038) 0:01:49.593 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.034) 0:01:49.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.031) 0:01:49.659 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.064) 0:01:49.723 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.066) 0:01:49.789 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 
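Editorial aside: the "Check LUKS version" assertion above runs against the `cryptsetup luksDump` output captured earlier in this log. A minimal Python sketch of pulling out the asserted fields from such output; the abbreviated `sample` text and the `parse_luks_dump` helper are illustrative reductions of the logged output, not part of the role:

```python
import re

# Abbreviated sample of the `cryptsetup luksDump` output logged above.
sample = """\
LUKS header information for /dev/sda
Version:        1
Cipher name:    serpent
Cipher mode:    xts-plain64
Key Slot 0: ENABLED
Key Slot 1: DISABLED
Key Slot 2: DISABLED
"""

def parse_luks_dump(text):
    """Extract the LUKS version, cipher name, and enabled key slots."""
    version = re.search(r"^Version:\s*(\S+)", text, re.M).group(1)
    cipher = re.search(r"^Cipher name:\s*(\S+)", text, re.M).group(1)
    slots = re.findall(r"^Key Slot (\d+): (ENABLED|DISABLED)", text, re.M)
    enabled = [int(n) for n, state in slots if state == "ENABLED"]
    return version, cipher, enabled

print(parse_luks_dump(sample))  # ('1', 'serpent', [0])
```

This matches the logged header: LUKS1, serpent in xts-plain64 mode, with only key slot 0 enabled.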
Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:49.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.031) 0:01:49.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.031) 0:01:49.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:49.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.031) 0:01:49.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.029) 0:01:49.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:50.004 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:50.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.030) 0:01:50.065 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.098) 0:01:50.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.035) 0:01:50.199 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.122) 0:01:50.321 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.034) 0:01:50.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097148, "inode_total": 2097152, "inode_used": 4, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "19cb1b46-b01b-479d-97dc-0615a909a41f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097148, "inode_total": 2097152, "inode_used": 4, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": 
"19cb1b46-b01b-479d-97dc-0615a909a41f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:06:08 +0000 (0:00:00.045) 0:01:50.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.041) 0:01:50.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.036) 0:01:50.479 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.038) 0:01:50.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.031) 0:01:50.550 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.031) 0:01:50.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.033) 0:01:50.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.041) 0:01:50.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.048) 0:01:50.706 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.035) 0:01:50.741 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.036) 0:01:50.778 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.030) 0:01:50.809 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.034) 0:01:50.843 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.037) 0:01:50.880 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.037) 0:01:50.918 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103141.8761215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103141.8761215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13703, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654103141.8761215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.364) 0:01:51.282 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.039) 0:01:51.322 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.040) 0:01:51.362 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:06:09 +0000 (0:00:00.033) 0:01:51.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.032) 0:01:51.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
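Editorial aside: the device-node verification above asserts on the `stat` facts just logged. A small Python sketch of the same checks against a subset of those facts; the `stat_info` dict copies values from the log, but the checks themselves are an illustrative reduction, not the role's actual assertion code:

```python
# Subset of the stat facts logged above for /dev/mapper/foo-test1.
stat_info = {"exists": True, "isblk": True, "islnk": False, "mode": "0660"}

# The role asserts the node exists and is a block device when the
# volume is expected to be present.
assert stat_info["exists"] and stat_info["isblk"]

# Mode 0660: read/write for root and the disk group, nothing for others.
assert int(stat_info["mode"], 8) == 0o660
print("device node ok")
```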
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.039) 0:01:51.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:51.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.561 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.037) 0:01:51.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.033) 
0:01:51.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:51.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:51.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.037) 0:01:51.793 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.034) 0:01:51.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 
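Editorial aside: the crypttab checks above are skipped for this volume (the LV itself is not encrypted), but the same checks ran earlier for the encrypted pool member. A minimal sketch of validating that entry's three fields per crypttab(5), using the entry captured earlier in this log; the field checks are illustrative, not the role's code:

```python
# Crypttab entry for the encrypted pool member, as captured earlier in
# this log: mapped name, backing device, key file.
entry = "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda -"

name, device, key_file = entry.split()
assert name.startswith("luks-"), "mapped name carries a luks-<UUID> prefix"
assert device.startswith("/dev/"), "second field is the backing device"
# Per crypttab(5), "-" in the key-file field means no key file: the
# passphrase is prompted for at unlock time.
assert key_file == "-"
print(name)
```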
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:51.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.030) 0:01:51.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:51.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.033) 0:01:51.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.032) 0:01:52.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.035) 0:01:52.053 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.032) 0:01:52.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.033) 0:01:52.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.033) 0:01:52.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:06:10 +0000 (0:00:00.031) 0:01:52.183 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.403) 0:01:52.586 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.378) 0:01:52.965 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.037) 0:01:53.002 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.032) 0:01:53.035 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.037) 0:01:53.072 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.033) 0:01:53.106 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.032) 0:01:53.138 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.031) 0:01:53.170 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.032) 0:01:53.202 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.035) 0:01:53.238 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.036) 0:01:53.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:06:11 +0000 (0:00:00.039) 0:01:53.314 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037004", "end": "2022-06-01 13:06:11.671469", "rc": 0, "start": "2022-06-01 13:06:11.634465" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.413) 0:01:53.727 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.040) 0:01:53.768 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.039) 0:01:53.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.034) 0:01:53.842 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.032) 0:01:53.875 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.036) 0:01:53.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.034) 0:01:53.946 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.036) 0:01:53.983 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.034) 0:01:54.017 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.029) 0:01:54.047 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:257
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.036) 0:01:54.084 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.074) 0:01:54.158 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:06:12 +0000 (0:00:00.046) 0:01:54.204 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.555) 0:01:54.760 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.078) 0:01:54.838 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.033) 0:01:54.872 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.032) 0:01:54.904 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.066) 0:01:54.970 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.026) 0:01:54.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.032) 0:01:55.030 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.035) 0:01:55.065 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.037) 0:01:55.103 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.031) 0:01:55.134 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.031) 0:01:55.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.031) 0:01:55.198 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.031) 0:01:55.229 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service" ] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:06:13 +0000 (0:00:00.050) 0:01:55.279 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no",
"AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", 
"StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } }

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:06:14 +0000 (0:00:00.688) 0:01:55.968 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_mount_id": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:06:16 +0000 (0:00:02.021) 0:01:57.990 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:06:16 +0000 (0:00:00.034) 0:01:58.024 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dbd5f7f20\x2dd665\x2d4abc\x2d9bd1\x2ddd0c981eaf5f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "name":
"systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": 
"none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit 
systemd-cryptsetup@luks\\x2dbd5f7f20\\x2dd665\\x2d4abc\\x2d9bd1\\x2ddd0c981eaf5f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dbd5f7f20\\\\x2dd665\\\\x2d4abc\\\\x2d9bd1\\\\x2ddd0c981eaf5f.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } }

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:06:17 +0000 (0:00:00.770) 0:01:58.794 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_mount_id": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:06:17 +0000 (0:00:00.043) 0:01:58.838 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:06:17 +0000 (0:00:00.036) 0:01:58.874 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_mount_id": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:06:17 +0000 (0:00:00.038) 0:01:58.913 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:06:17 +0000 (0:00:00.416) 0:01:59.330 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:06:18 +0000 (0:00:00.641) 0:01:59.971 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:06:18 +0000 (0:00:00.030) 0:02:00.002 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:06:19 +0000 (0:00:00.646) 0:02:00.648 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103147.9021215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4896da07d32fbe600f739aa566c21e84a8e9e12f", "ctime": 1654103145.7631216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103145.7621214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1372347311", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:06:19 +0000 (0:00:00.378) 0:02:01.027 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:06:19 +0000 (0:00:00.387) 0:02:01.414 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks_pool.yml:267 Wednesday 01 June 2022 17:06:20 +0000 (0:00:00.905) 0:02:02.320 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:06:20 +0000 (0:00:00.059) 0:02:02.379 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:06:20 +0000 (0:00:00.029) 0:02:02.409 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_mount_id": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:06:21 +0000 (0:00:00.037) 0:02:02.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, 
"/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:06:21 +0000 (0:00:00.376) 0:02:02.823 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003574", "end": "2022-06-01 13:06:21.165454", "rc": 0, "start": "2022-06-01 13:06:21.161880" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:06:21 +0000 (0:00:00.403) 0:02:03.227 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.004920", "end": "2022-06-01 13:06:21.561231", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:06:21.556311" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.393) 0:02:03.620 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.027) 0:02:03.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.029) 0:02:03.676 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.147) 0:02:03.824 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.040) 0:02:03.865 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.138) 0:02:04.004 ******** ok: [/cache/rhel-x.qcow2] 
=> { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.037) 0:02:04.041 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.043) 0:02:04.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.034) 0:02:04.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.036) 0:02:04.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.031) 0:02:04.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.030) 0:02:04.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.029) 0:02:04.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.029) 0:02:04.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.040) 0:02:04.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.049) 0:02:04.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify 
the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:06:22 +0000 (0:00:00.026) 0:02:04.393 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.035) 0:02:04.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.030) 0:02:04.458 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.031) 0:02:04.490 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.034) 0:02:04.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 
June 2022 17:06:23 +0000 (0:00:00.025) 0:02:04.550 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103175.8011215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103175.8011215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103175.8011215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.374) 0:02:04.925 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.038) 0:02:04.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.026) 0:02:04.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:06:23 +0000 
(0:00:00.032) 0:02:05.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.029) 0:02:05.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:06:23 +0000 (0:00:00.026) 0:02:05.079 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.376) 0:02:05.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.032) 0:02:05.488 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.036) 0:02:05.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.026) 0:02:05.551 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:05.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.029) 0:02:05.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.028) 0:02:05.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.031) 0:02:05.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.029) 0:02:05.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:06:24 +0000 
(0:00:00.040) 0:02:05.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.041) 0:02:05.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.031) 0:02:05.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:05.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.033) 0:02:05.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:05.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 
Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.031) 0:02:05.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.031) 0:02:05.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:06.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:06.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.032) 0:02:06.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.034) 0:02:06.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.031) 0:02:06.132 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.033) 0:02:06.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.029) 0:02:06.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:06.226 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.038) 0:02:06.265 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.077) 0:02:06.342 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.030) 0:02:06.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:06:24 +0000 (0:00:00.029) 0:02:06.402 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.029) 0:02:06.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.029) 0:02:06.461 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.032) 0:02:06.494 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.033) 0:02:06.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.029) 0:02:06.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.030) 
0:02:06.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.030) 0:02:06.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.030) 0:02:06.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.033) 0:02:06.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.033) 0:02:06.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.030) 0:02:06.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.029) 0:02:06.774 ******** ok: [/cache/rhel-x.qcow2] => { 
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.029) 0:02:06.804 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=546 changed=40 unreachable=0 failed=3 skipped=337 rescued=3 ignored=0

Wednesday 01 June 2022 17:06:25 +0000 (0:00:00.015) 0:02:06.820 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 10.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.50s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.31s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.25s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/tests_luks_pool_scsi_generated.yml:3 -------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : get required packages ---------------------- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Mask the systemd cryptsetup services ------- 0.99s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:06:26 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:06:27 +0000 (0:00:01.266) 0:00:01.288 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported:
/tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml
statically imported: /tmp/tmp7247_7fr/tests/create-test-file.yml
statically imported: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml

PLAYBOOK: tests_luks_scsi_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_luks_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_scsi_generated.yml:3
Wednesday 01 June 2022 17:06:27 +0000 (0:00:00.069) 0:00:01.358 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks_scsi_generated.yml:7
Wednesday 01 June 2022 17:06:28 +0000 (0:00:01.065) 0:00:02.424 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:2
Wednesday 01 June 2022 17:06:28 +0000 (0:00:00.033) 0:00:02.457 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:11
Wednesday 01 June 2022 17:06:29 +0000 (0:00:00.806) 0:00:03.264 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:06:29 +0000 (0:00:00.037) 0:00:03.301 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:06:29 +0000 (0:00:00.158) 0:00:03.460 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.522) 0:00:03.983 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.074) 0:00:04.058 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.020) 0:00:04.078 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.020) 0:00:04.099 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.185) 0:00:04.284 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:06:30 +0000 (0:00:00.017) 0:00:04.302 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:06:31 +0000 (0:00:01.056) 0:00:05.358 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:06:31 +0000 (0:00:00.046) 0:00:05.405 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:06:31 +0000 (0:00:00.046) 0:00:05.451 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:06:32 +0000 (0:00:00.693) 0:00:06.145 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:06:32 +0000 (0:00:00.081) 0:00:06.227 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:06:32 +0000 (0:00:00.051) 0:00:06.278 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:06:32 +0000 (0:00:00.023) 0:00:06.301 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:06:32 +0000 (0:00:00.021) 0:00:06.323 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:06:33 +0000 (0:00:00.854) 0:00:07.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service": { "name": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:06:35 +0000 (0:00:01.769) 0:00:08.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:06:35 +0000 (0:00:00.041) 0:00:08.989 
******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2de06afd73\x2d5775\x2d4a8f\x2d9f6b\x2dd549bfae82c7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "name": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2de06afd73\\\\x2d5775\\\\x2d4a8f\\\\x2d9f6b\\\\x2dd549bfae82c7.target\" umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": 
"success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e06afd73-5775-4a8f-9f6b-d549bfae82c7 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", 
"IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": 
"infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de06afd73\\\\x2d5775\\\\x2d4a8f\\\\x2d9f6b\\\\x2dd549bfae82c7.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:05:59 EDT", "StateChangeTimestampMonotonic": "2748550771", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2de06afd73\\\\x2d5775\\\\x2d4a8f\\\\x2d9f6b\\\\x2dd549bfae82c7.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:06:36 +0000 (0:00:00.944) 0:00:09.933 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:06:36 +0000 (0:00:00.527) 0:00:10.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] 
***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:06:36 +0000 (0:00:00.028) 0:00:10.490 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2de06afd73\x2d5775\x2d4a8f\x2d9f6b\x2dd549bfae82c7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "name": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", 
"CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de06afd73\\x2d5775\\x2d4a8f\\x2d9f6b\\x2dd549bfae82c7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de06afd73\\\\x2d5775\\\\x2d4a8f\\\\x2d9f6b\\\\x2dd549bfae82c7.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": 
"success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.706) 0:00:11.196 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, 
"crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.033) 0:00:11.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.033) 0:00:11.263 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.032) 0:00:11.296 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.026) 0:00:11.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.026) 0:00:11.348 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.027) 0:00:11.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:06:37 +0000 (0:00:00.027) 0:00:11.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103181.5601215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103179.3471215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792397, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103179.3461215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "620317175", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:06:38 +0000 (0:00:00.543) 0:00:11.947 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:06:38 +0000 (0:00:00.029) 0:00:11.976 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:14 Wednesday 01 June 2022 17:06:38 +0000 (0:00:00.833) 0:00:12.810 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] 
***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:06:38 +0000 (0:00:00.044) 0:00:12.855 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.529) 0:00:13.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.066) 0:00:13.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.030) 0:00:13.482 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:24 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.032) 0:00:13.514 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.046) 0:00:13.561 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:06:39 +0000 (0:00:00.043) 0:00:13.604 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set 
platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.501) 0:00:14.106 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.069) 0:00:14.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.029) 0:00:14.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.030) 0:00:14.236 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.061) 0:00:14.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:06:40 +0000 (0:00:00.025) 0:00:14.322 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:06:41 +0000 (0:00:00.858) 0:00:15.180 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:06:41 +0000 (0:00:00.033) 0:00:15.214 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:06:41 +0000 (0:00:00.035) 0:00:15.249 ******** ok: [/cache/rhel-x.qcow2] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:06:42 +0000 (0:00:01.047) 0:00:16.297 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:06:42 +0000 (0:00:00.054) 0:00:16.352 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:06:42 +0000 (0:00:00.028) 0:00:16.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:06:42 +0000 (0:00:00.043) 0:00:16.424 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:06:42 +0000 (0:00:00.030) 0:00:16.455 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:06:43 +0000 (0:00:00.861) 0:00:17.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": 
"pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": 
"selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:06:45 +0000 (0:00:01.697) 0:00:19.014 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:06:45 +0000 (0:00:00.045) 0:00:19.059 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:06:45 +0000 (0:00:00.029) 0:00:19.089 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:06:46 +0000 (0:00:01.011) 0:00:20.100 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': 
None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.037) 0:00:20.138 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:40 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.025) 0:00:20.163 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:46 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.030) 0:00:20.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:54 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.032) 0:00:20.226 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.045) 0:00:20.271 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.041) 0:00:20.312 
******** ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:06:46 +0000 (0:00:00.513)       0:00:20.825 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.071)       0:00:20.897 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.030)       0:00:20.927 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.029)       0:00:20.957 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.064)       0:00:21.021 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.025)       0:00:21.047 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:06:47 +0000 (0:00:00.811)       0:00:21.858 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:06:48 +0000 (0:00:00.031)       0:00:21.889 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "encryption": true,
            "encryption_password": "yabbadabbadoo",
            "mount_point": "/opt/test1",
            "name": "foo",
            "type": "disk"
        }
    ]
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:06:48 +0000
(0:00:00.033)       0:00:21.923 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.952)       0:00:22.875 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.090)       0:00:22.965 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.027)       0:00:22.993 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.029)       0:00:23.022 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.029)       0:00:23.052 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:06:49 +0000 (0:00:00.792)       0:00:23.844
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": 
"cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" 
}, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": 
"modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { 
"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:06:51 +0000 (0:00:01.626) 0:00:25.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:06:51 +0000 (0:00:00.052) 0:00:25.524 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:06:51 +0000 (0:00:00.027) 0:00:25.551 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "present" } ], 
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:07:03 +0000 (0:00:11.634) 0:00:37.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 
2022 17:07:03 +0000 (0:00:00.030) 0:00:37.217 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:07:03 +0000 (0:00:00.029) 0:00:37.246 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:07:03 +0000 (0:00:00.038) 0:00:37.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:07:03 +0000 (0:00:00.031) 0:00:37.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } 
TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:07:03 +0000 (0:00:00.036) 0:00:37.353 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:07:03 +0000 (0:00:00.028) 0:00:37.381 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:07:04 +0000 (0:00:00.702) 0:00:38.084 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:07:04 +0000 (0:00:00.567) 0:00:38.652 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:07:05 
+0000 (0:00:00.664) 0:00:39.317 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103181.5601215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103179.3471215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792397, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103179.3461215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "620317175", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:07:05 +0000 (0:00:00.383) 0:00:39.700 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-f2f2d083-3c31-4478-9173-db921c2b0e6b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:07:06 +0000 (0:00:00.527) 0:00:40.228 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:66 Wednesday 01 June 2022 17:07:07 
+0000 (0:00:00.858) 0:00:41.087 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:07:07 +0000 (0:00:00.048) 0:00:41.136 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:07:07 +0000 (0:00:00.030) 0:00:41.166 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:07:07 +0000 (0:00:00.039) 0:00:41.205 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "size": "10G", "type": "crypt", "uuid": "4dc72c63-0a9d-4010-a5cd-dcea8967ac15" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2f2d083-3c31-4478-9173-db921c2b0e6b" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume 
existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:07:07 +0000 (0:00:00.527) 0:00:41.733 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002688", "end": "2022-06-01 13:07:07.699194", "rc": 0, "start": "2022-06-01 13:07:07.696506" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.473) 0:00:42.207 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003332", "end": "2022-06-01 13:07:08.079862", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:07:08.076530" } STDOUT: luks-f2f2d083-3c31-4478-9173-db921c2b0e6b /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.380) 0:00:42.588 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.027) 0:00:42.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.029) 0:00:42.644 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.057) 0:00:42.702 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.035) 0:00:42.738 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:07:08 +0000 (0:00:00.123) 0:00:42.861 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.035) 0:00:42.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": 
[ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "4dc72c63-0a9d-4010-a5cd-dcea8967ac15" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "4dc72c63-0a9d-4010-a5cd-dcea8967ac15" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.042) 0:00:42.938 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.036) 0:00:42.974 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.036) 0:00:43.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.035) 0:00:43.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.028) 0:00:43.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.029) 0:00:43.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.028) 0:00:43.133 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.028) 0:00:43.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.046) 0:00:43.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.033) 0:00:43.241 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.035) 0:00:43.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.029) 0:00:43.305 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.029) 0:00:43.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.035) 0:00:43.370 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.036) 0:00:43.406 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103222.4621215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103222.4621215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103222.4621215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.371) 0:00:43.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.037) 0:00:43.814 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] 
*************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:07:09 +0000 (0:00:00.034) 0:00:43.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.032) 0:00:43.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.029) 0:00:43.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.033) 0:00:43.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103222.6061215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103222.6061215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 13951, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103222.6061215, "nlink": 1, "path": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] 
*************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.373) 0:00:44.318 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.013241", "end": "2022-06-01 13:07:10.211230", "rc": 0, "start": "2022-06-01 13:07:10.197989" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: f2f2d083-3c31-4478-9173-db921c2b0e6b Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 11 Memory: 906462 Threads: 4 Salt: a1 5b 82 ff d2 fd 59 5b 04 95 09 1b 3b 63 47 fa 2d 42 33 e4 ed 62 d7 86 52 30 ef b7 e6 24 c4 4c AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 96803 Salt: b0 3f d7 c4 b3 66 8c 91 b6 2c 1e 87 63 00 11 b9 9f c6 25 26 8a 17 c6 7f 5b 28 f1 a1 60 57 e2 63 Digest: 07 49 eb 50 bf 08 f2 d0 4c a7 0b b5 27 39 96 f1 4f 91 db f5 fd 4e ed 08 1b c4 f9 73 bf 4e f3 d5 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.394) 0:00:44.713 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.036) 0:00:44.749 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume 
if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.039) 0:00:44.789 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.037) 0:00:44.826 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:07:10 +0000 (0:00:00.035) 0:00:44.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.029) 0:00:44.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.028) 0:00:44.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:44.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": 
"-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.040) 0:00:44.991 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.035) 0:00:45.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.036) 0:00:45.063 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.037) 0:00:45.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.036) 0:00:45.138 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.034) 0:00:45.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.031) 0:00:45.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.037) 0:00:45.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.041) 0:00:45.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.032) 0:00:45.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.031) 0:00:45.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.499 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.033) 0:00:45.532 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.032) 0:00:45.564 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.035) 0:00:45.599 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.033) 0:00:45.633 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.694 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.035) 0:00:45.729 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.032) 0:00:45.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.032) 0:00:45.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:07:11 +0000 (0:00:00.030) 0:00:45.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:45.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.029) 0:00:45.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:45.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:45.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:46.007 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:46.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.030) 0:00:46.068 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:72 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.560) 0:00:46.628 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.053) 0:00:46.682 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:07:12 +0000 (0:00:00.045) 0:00:46.727 ******** ok: [/cache/rhel-x.qcow2] TASK 
[linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.522) 0:00:47.250 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.111) 0:00:47.362 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.031) 0:00:47.393 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.029) 0:00:47.423 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.065) 0:00:47.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:07:13 +0000 (0:00:00.025) 0:00:47.514 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:07:14 +0000 (0:00:00.880) 0:00:48.395 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:07:14 +0000 (0:00:00.034) 0:00:48.429 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:07:14 +0000 (0:00:00.036) 0:00:48.466 
******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:07:15 +0000 (0:00:01.081) 0:00:49.547 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:07:15 +0000 (0:00:00.058) 0:00:49.606 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:07:15 +0000 (0:00:00.027) 0:00:49.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:07:15 +0000 (0:00:00.030) 0:00:49.664 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:07:15 +0000 (0:00:00.028) 0:00:49.693 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:07:16 +0000 (0:00:00.825) 0:00:50.518 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { 
"name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { 
"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": 
"tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:07:18 +0000 (0:00:01.660) 0:00:52.179 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:07:18 +0000 (0:00:00.047) 0:00:52.227 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:07:18 +0000 (0:00:00.028) 0:00:52.255 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-f2f2d083-3c31-4478-9173-db921c2b0e6b' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:07:19 +0000 (0:00:01.047) 0:00:53.303 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10720641024, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': 
None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-f2f2d083-3c31-4478-9173-db921c2b0e6b' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:07:19 +0000 (0:00:00.044) 0:00:53.347 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:87 Wednesday 01 June 2022 17:07:19 +0000 (0:00:00.028) 0:00:53.376 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:93 Wednesday 01 June 2022 17:07:19 +0000 (0:00:00.034) 0:00:53.411 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 
Wednesday 01 June 2022 17:07:19 +0000 (0:00:00.036) 0:00:53.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103232.1221216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103232.1221216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103232.1221216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "196985479", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:07:19 +0000 (0:00:00.392) 0:00:53.839 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:104 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.034) 0:00:53.873 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.049) 0:00:53.923 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:07:20 +0000 
(0:00:00.041) 0:00:53.964 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.510) 0:00:54.474 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.072) 0:00:54.547 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.031) 0:00:54.579 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.030) 0:00:54.609 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.062) 0:00:54.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:07:20 +0000 (0:00:00.028) 0:00:54.700 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:07:21 +0000 (0:00:00.843) 0:00:55.543 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:07:21 +0000 (0:00:00.033) 0:00:55.576 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 
01 June 2022 17:07:21 +0000 (0:00:00.035) 0:00:55.612 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:07:22 +0000 (0:00:01.076) 0:00:56.689 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:07:22 +0000 (0:00:00.057) 0:00:56.747 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:07:22 +0000 (0:00:00.031) 0:00:56.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:07:22 +0000 (0:00:00.030) 0:00:56.808 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:07:22 +0000 (0:00:00.028) 0:00:56.837 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:07:23 +0000 (0:00:00.835) 
0:00:57.672 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": 
"cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" 
}, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": 
"modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { 
"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:07:25 +0000 (0:00:01.663) 0:00:59.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:07:25 +0000 (0:00:00.045) 0:00:59.381 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:07:25 +0000 (0:00:00.029) 0:00:59.410 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": 
"luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:07:27 +0000 (0:00:01.593) 0:01:01.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.032) 0:01:01.037 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.030) 0:01:01.068 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", 
"encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.039) 0:01:01.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.034) 0:01:01.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.035) 0:01:01.178 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f2f2d083-3c31-4478-9173-db921c2b0e6b" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:07:27 +0000 (0:00:00.389) 0:01:01.567 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:07:28 +0000 (0:00:00.662) 0:01:02.229 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=dc909711-0f3f-4319-bc49-89e7915b8838', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "mounted" }, "name": 
"/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:07:28 +0000 (0:00:00.432) 0:01:02.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:07:29 +0000 (0:00:00.680) 0:01:03.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103228.0791216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "58306297d6c3889d87556238a32fa46bf1dfb573", "ctime": 1654103225.7181215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 17417711, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103225.7161214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2315926994", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:07:29 +0000 (0:00:00.378) 0:01:03.721 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-f2f2d083-3c31-4478-9173-db921c2b0e6b', u'backing_device': u'/dev/sda'}) => { 
"ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:07:30 +0000 (0:00:00.392) 0:01:04.113 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:117 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.800) 0:01:04.913 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.049) 0:01:04.963 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.030) 0:01:04.993 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10720641024, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.039) 0:01:05.033 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "dc909711-0f3f-4319-bc49-89e7915b8838" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", 
"uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.370) 0:01:05.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002433", "end": "2022-06-01 13:07:31.271624", "rc": 0, "start": "2022-06-01 13:07:31.269191" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=dc909711-0f3f-4319-bc49-89e7915b8838 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:07:31 +0000 (0:00:00.371) 0:01:05.775 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002717", "end": "2022-06-01 13:07:31.639439", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:07:31.636722" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.369) 0:01:06.144 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.027) 0:01:06.172 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.029) 0:01:06.201 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.060) 0:01:06.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.035) 0:01:06.297 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.120) 0:01:06.417 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.035) 0:01:06.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "dc909711-0f3f-4319-bc49-89e7915b8838" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592358, "block_size": 4096, "block_total": 2618880, "block_used": 26522, "device": "/dev/sda", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10618298368, "size_total": 10726932480, "uuid": "dc909711-0f3f-4319-bc49-89e7915b8838" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.042) 0:01:06.495 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.083) 0:01:06.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.034) 
0:01:06.614 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.037) 0:01:06.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.031) 0:01:06.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.030) 0:01:06.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.030) 0:01:06.744 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.036) 0:01:06.781 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=dc909711-0f3f-4319-bc49-89e7915b8838 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.043) 0:01:06.824 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:07:32 +0000 (0:00:00.034) 0:01:06.859 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.033) 0:01:06.893 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.029) 0:01:06.922 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.029) 0:01:06.951 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.038) 0:01:06.990 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.037) 0:01:07.027 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103246.3681216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103246.3681216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103246.3681216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.373) 0:01:07.401 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.037) 0:01:07.438 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.035) 0:01:07.474 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.033) 0:01:07.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.032) 0:01:07.539 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.036) 0:01:07.576 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.032) 0:01:07.609 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.029) 0:01:07.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.028) 0:01:07.667 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.035) 0:01:07.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.030) 0:01:07.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.029) 0:01:07.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.031) 0:01:07.794 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.029) 0:01:07.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:07:33 +0000 (0:00:00.030) 0:01:07.853 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.037) 0:01:07.891 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.035) 0:01:07.927 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:07.958 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:07.989 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.019 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:08.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.028) 0:01:08.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.039) 0:01:08.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.040) 0:01:08.188 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.033) 0:01:08.222 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.281 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.030) 0:01:08.311 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.030) 0:01:08.342 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:08.374 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.032) 0:01:08.407 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:08.438 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.030) 0:01:08.498 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:08.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.037) 0:01:08.567 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.036) 0:01:08.603 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.033) 0:01:08.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.667 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.028) 0:01:08.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.031) 0:01:08.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.029) 0:01:08.785 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.030) 0:01:08.816 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:07:34 +0000 (0:00:00.030) 0:01:08.846 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.029) 0:01:08.876 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.031) 0:01:08.907 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [create a file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.032) 0:01:08.940 ********
changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Add encryption to the volume] ********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:123
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.387) 0:01:09.328 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.097) 0:01:09.425 ********
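[Editor's note: the "Add encryption to the volume" step beginning here re-invokes the linux-system-roles.storage role. A sketch of that invocation, reconstructed only from the storage_volumes values the role echoes later in this log; the surrounding play boilerplate is an assumption, not copied from tests_luks.yml:]

```yaml
# Hypothetical reconstruction -- only the storage_volumes values are taken
# from this log; the hosts/roles wrapper itself is assumed.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo
```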
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:07:35 +0000 (0:00:00.044) 0:01:09.470 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.503) 0:01:09.973 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.071) 0:01:10.045 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.031) 0:01:10.076 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.030) 0:01:10.107 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.063) 0:01:10.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:07:36 +0000 (0:00:00.026) 0:01:10.198 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:07:37 +0000 (0:00:01.070) 0:01:11.268 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:07:37 +0000 (0:00:00.035) 0:01:11.304 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:07:37 +0000 (0:00:00.037) 0:01:11.341 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:07:38 +0000 (0:00:01.029) 0:01:12.370 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:07:38 +0000 (0:00:00.054) 0:01:12.425 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:07:38 +0000 (0:00:00.027) 0:01:12.453 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:07:38 +0000 (0:00:00.029) 0:01:12.482 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:07:38 +0000 (0:00:00.028)
0:01:12.510 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:07:39 +0000 (0:00:00.850) 0:01:13.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": 
"dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": 
"insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service": { "name": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": 
"systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:07:41 +0000 (0:00:01.703) 0:01:15.065 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:07:41 +0000 (0:00:00.056) 0:01:15.121 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2df2f2d083\x2d3c31\x2d4478\x2d9173\x2ddb921c2b0e6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "name": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2df2f2d083\\\\x2d3c31\\\\x2d4478\\\\x2d9173\\\\x2ddb921c2b0e6b.target\" umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", 
"ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f2f2d083-3c31-4478-9173-db921c2b0e6b", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f2f2d083-3c31-4478-9173-db921c2b0e6b /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f2f2d083-3c31-4478-9173-db921c2b0e6b /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f2f2d083-3c31-4478-9173-db921c2b0e6b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f2f2d083-3c31-4478-9173-db921c2b0e6b ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not 
set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2df2f2d083\\\\x2d3c31\\\\x2d4478\\\\x2d9173\\\\x2ddb921c2b0e6b.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", 
"StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:07:28 EDT", "StateChangeTimestampMonotonic": "2837843409", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2df2f2d083\\\\x2d3c31\\\\x2d4478\\\\x2d9173\\\\x2ddb921c2b0e6b.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:07:41 +0000 (0:00:00.690) 0:01:15.811 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:07:42 +0000 (0:00:01.040) 0:01:16.852 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:07:43 +0000 (0:00:00.042) 0:01:16.895 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2df2f2d083\x2d3c31\x2d4478\x2d9173\x2ddb921c2b0e6b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "name": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner 
cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", 
"KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df2f2d083\\x2d3c31\\x2d4478\\x2d9173\\x2ddb921c2b0e6b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2df2f2d083\\\\x2d3c31\\\\x2d4478\\\\x2d9173\\\\x2ddb921c2b0e6b.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "disabled",
        "UnitFileState": "masked",
        "UtmpMode": "init",
        "WatchdogSignal": "6",
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "infinity"
    }
}

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:138
Wednesday 01 June 2022 17:07:43 +0000 (0:00:00.748) 0:01:17.643 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the output of the safe_mode test] *********************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:144
Wednesday 01 June 2022 17:07:43 +0000 (0:00:00.036) 0:01:17.680 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [stat the file] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10
Wednesday 01 June 2022 17:07:43 +0000 (0:00:00.037) 0:01:17.718 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654103254.8221216,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1654103254.8221216,
        "dev": 2048,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 131,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0644",
        "mtime": 1654103254.8221216,
        "nlink": 1,
        "path": "/opt/test1/quux",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "1219387888",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [assert file presence] ****************************************************
task path:
/tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.391) 0:01:18.109 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:155 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.036) 0:01:18.146 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.049) 0:01:18.195 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.044) 0:01:18.240 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.531) 0:01:18.772 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:07:44 +0000 (0:00:00.073) 0:01:18.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:07:45 +0000 (0:00:00.030) 0:01:18.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:07:45 +0000 (0:00:00.031) 0:01:18.908 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:07:45 +0000 (0:00:00.062) 0:01:18.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:07:45 +0000 (0:00:00.025) 0:01:18.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:07:45 +0000 (0:00:00.873) 0:01:19.869 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:07:46 +0000 (0:00:00.035) 0:01:19.905 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "encryption": true,
            "encryption_password": "yabbadabbadoo",
            "mount_point": "/opt/test1",
            "name": "foo",
            "type": "disk"
        }
    ]
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:07:46 +0000 (0:00:00.038) 0:01:19.943 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:07:47 +0000 (0:00:01.022) 0:01:20.966 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:07:47 +0000 (0:00:00.053) 0:01:21.019 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:07:47 +0000 (0:00:00.025) 0:01:21.045 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:07:47 +0000 (0:00:00.029) 0:01:21.075 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:07:47 +0000 (0:00:00.028) 0:01:21.103 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:07:48 +0000 (0:00:00.830) 0:01:21.934 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": 
"unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:07:49 +0000 (0:00:01.713) 0:01:23.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:07:49 +0000 (0:00:00.046) 0:01:23.694 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:07:49 +0000 (0:00:00.027) 0:01:23.721 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:07:59 +0000 (0:00:09.936) 0:01:33.658 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:07:59 +0000 (0:00:00.039) 0:01:33.698 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:07:59 +0000 (0:00:00.038) 0:01:33.737 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": 
"xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:07:59 +0000 (0:00:00.044) 0:01:33.781 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:07:59 +0000 (0:00:00.036) 0:01:33.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:07:59 +0000 (0:00:00.037) 0:01:33.855 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=dc909711-0f3f-4319-bc49-89e7915b8838', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", 
"path": "/opt/test1", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=dc909711-0f3f-4319-bc49-89e7915b8838" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:08:00 +0000 (0:00:00.388) 0:01:34.243 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:08:01 +0000 (0:00:00.698) 0:01:34.941 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:08:01 +0000 (0:00:00.408) 0:01:35.350 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:08:02 +0000 (0:00:00.645) 0:01:35.996 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103251.6381216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103249.5921216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103249.5911214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1920188261", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:08:02 +0000 (0:00:00.367) 0:01:36.363 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:08:02 +0000 (0:00:00.428) 0:01:36.792 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:168 Wednesday 01 June 2022 17:08:03 +0000 (0:00:00.858) 0:01:37.650 
******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:08:03 +0000 (0:00:00.050) 0:01:37.701 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:08:03 +0000 (0:00:00.030) 0:01:37.731 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:08:03 +0000 (0:00:00.038) 0:01:37.769 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "size": "10G", "type": "crypt", "uuid": "97b17965-ec48-48f4-acc1-5eea3d0ae6da" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume 
existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:08:04 +0000 (0:00:00.369) 0:01:38.138 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002747", "end": "2022-06-01 13:08:03.995825", "rc": 0, "start": "2022-06-01 13:08:03.993078" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:08:04 +0000 (0:00:00.363) 0:01:38.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002624", "end": "2022-06-01 13:08:04.359986", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:08:04.357362" } STDOUT: luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:08:04 +0000 (0:00:00.363) 0:01:38.865 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.027) 0:01:38.893 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.030) 0:01:38.923 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.060) 0:01:38.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.032) 0:01:39.017 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.113) 0:01:39.130 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.042) 0:01:39.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "97b17965-ec48-48f4-acc1-5eea3d0ae6da" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588290, "block_size": 4096, "block_total": 2614784, "block_used": 26494, "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fstype": "xfs", "inode_available": 5234685, "inode_total": 5234688, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10601635840, "size_total": 10710155264, "uuid": "97b17965-ec48-48f4-acc1-5eea3d0ae6da" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.045) 0:01:39.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.092) 0:01:39.311 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.034) 0:01:39.345 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.039) 0:01:39.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.028) 0:01:39.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.036) 0:01:39.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.033) 0:01:39.483 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.038) 0:01:39.522 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.049) 0:01:39.571 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.034) 0:01:39.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.035) 0:01:39.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.028) 0:01:39.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.034) 0:01:39.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.038) 0:01:39.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:08:05 +0000 (0:00:00.039) 0:01:39.781 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103278.9131215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103278.9131215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103278.9131215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.403) 0:01:40.185 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.037) 0:01:40.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.034) 0:01:40.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.035) 0:01:40.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.032) 0:01:40.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.035) 0:01:40.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103279.0611215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103279.0611215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14244, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103279.0611215, "nlink": 1, "path": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "pw_name": 
"root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:08:06 +0000 (0:00:00.393) 0:01:40.753 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012521", "end": "2022-06-01 13:08:06.630427", "rc": 0, "start": "2022-06-01 13:08:06.617906" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 8 Memory: 906462 Threads: 4 Salt: bc 3f 06 69 ef 20 d2 c7 28 a2 1e b8 95 71 cd 6b 9f 29 e3 ff 23 19 fd 12 b4 26 39 55 f5 f5 70 91 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 98107 Salt: 5a 1b 78 8a 33 86 50 ed 82 b9 00 ae 0a 0b 23 3f 63 90 ab d3 6f c0 77 f8 2c de 05 c8 61 30 9a 55 Digest: 77 08 85 51 13 ab 8f 79 b0 2b ea 77 88 23 ff d3 52 9a 89 92 5e f3 cc 4c ed 8a d7 26 dd 88 c1 44 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.385) 0:01:41.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.039) 0:01:41.178 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.038) 0:01:41.217 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.038) 0:01:41.256 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.037) 0:01:41.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.033) 0:01:41.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.030) 0:01:41.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:08:07 +0000 
(0:00:00.029) 0:01:41.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.036) 0:01:41.422 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.035) 0:01:41.458 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.038) 0:01:41.496 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.042) 0:01:41.538 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.038) 0:01:41.577 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.030) 0:01:41.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.030) 0:01:41.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.030) 0:01:41.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.034) 0:01:41.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.035) 0:01:41.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.035) 0:01:41.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.034) 0:01:41.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.031) 0:01:41.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:08:07 +0000 (0:00:00.029) 0:01:41.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.029) 0:01:41.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.033) 0:01:41.934 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.033) 0:01:41.967 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.036) 
0:01:42.004 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.036) 0:01:42.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.033) 0:01:42.074 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.033) 0:01:42.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.035) 0:01:42.143 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.090) 0:01:42.233 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.034) 0:01:42.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.030) 0:01:42.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.030) 0:01:42.329 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.029) 0:01:42.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.030) 0:01:42.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.033) 0:01:42.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.030) 0:01:42.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.031) 0:01:42.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.030) 0:01:42.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.029) 0:01:42.544 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:176 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.028) 0:01:42.573 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.051) 0:01:42.625 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:08:08 +0000 (0:00:00.044) 0:01:42.669 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:08:09 +0000 (0:00:00.523) 0:01:43.192 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:08:09 +0000 (0:00:00.072) 0:01:43.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:08:09 +0000 (0:00:00.032) 0:01:43.297 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:08:09 +0000 (0:00:00.031) 0:01:43.329 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:08:09 +0000 (0:00:00.064) 0:01:43.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:08:09 +0000 (0:00:00.026) 0:01:43.419 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:08:10 +0000 (0:00:00.949) 0:01:44.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:08:10 +0000 (0:00:00.039) 0:01:44.408 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:08:10 +0000 (0:00:00.035) 0:01:44.444 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], 
"changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:08:11 +0000 (0:00:01.094) 0:01:45.539 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:08:11 +0000 (0:00:00.058) 0:01:45.597 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:08:11 +0000 (0:00:00.030) 0:01:45.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:08:11 +0000 (0:00:00.031) 0:01:45.659 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:08:11 +0000 (0:00:00.031) 0:01:45.691 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:08:12 +0000 (0:00:00.851) 0:01:46.542 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": 
{ "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": 
{ "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:08:14 +0000 (0:00:01.783) 0:01:48.326 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:08:14 +0000 (0:00:00.045) 0:01:48.372 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:08:14 +0000 (0:00:00.030) 0:01:48.402 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:08:15 +0000 (0:00:01.124) 0:01:49.527 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, 
u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.044) 0:01:49.571 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:197 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.028) 0:01:49.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:203 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.035) 0:01:49.635 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:210 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.034) 0:01:49.670 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.049) 0:01:49.719 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:08:15 +0000 (0:00:00.047) 0:01:49.767 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.522) 0:01:50.289 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.074) 0:01:50.364 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.031) 0:01:50.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.032) 0:01:50.428 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.064) 0:01:50.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:08:16 +0000 (0:00:00.033) 0:01:50.526 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:08:17 +0000 (0:00:00.891) 0:01:51.417 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": 
"partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:08:17 +0000 (0:00:00.038) 0:01:51.456 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:08:17 +0000 (0:00:00.033) 0:01:51.490 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:08:18 +0000 (0:00:01.065) 0:01:52.555 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:08:18 +0000 (0:00:00.056) 0:01:52.612 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:08:18 +0000 (0:00:00.029) 0:01:52.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:08:18 +0000 (0:00:00.031) 0:01:52.673 
******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:08:18 +0000 (0:00:00.038) 0:01:52.711 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:08:19 +0000 (0:00:00.828) 0:01:53.539 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": 
{ "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" 
}, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": 
"nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:08:21 +0000 (0:00:01.782) 0:01:55.322 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:08:21 +0000 (0:00:00.049) 0:01:55.372 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 
June 2022 17:08:21 +0000 (0:00:00.029) 0:01:55.401 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:08:31 +0000 (0:00:09.790) 0:02:05.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.030) 0:02:05.223 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.030) 0:02:05.253 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.044) 0:02:05.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", 
"_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.038) 0:02:05.336 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.035) 0:02:05.371 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd" 
} TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:08:31 +0000 (0:00:00.388) 0:02:05.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:08:32 +0000 (0:00:00.662) 0:02:06.422 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:08:32 +0000 (0:00:00.431) 0:02:06.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:08:33 +0000 (0:00:00.674) 0:02:07.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103284.3591216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"e6eaef7a62a6475d9f0497337703e035b52b2b14", "ctime": 1654103282.2701216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103282.2681215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "905287677", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:08:34 +0000 (0:00:00.412) 0:02:07.941 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:08:34 +0000 (0:00:00.752) 0:02:08.694 ******** ok: 
[/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:227 Wednesday 01 June 2022 17:08:35 +0000 (0:00:00.846) 0:02:09.540 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:08:35 +0000 (0:00:00.052) 0:02:09.592 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume 
information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:08:35 +0000 (0:00:00.044) 0:02:09.637 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:08:35 +0000 (0:00:00.030) 0:02:09.667 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "size": "10G", "type": "crypt", "uuid": "6db48060-a401-4730-955d-6fab78544743" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "de641b06-8ece-4140-bb76-6d4558d85b59" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
    "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
} }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:08:36 +0000 (0:00:00.389) 0:02:10.057 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003482", "end": "2022-06-01 13:08:35.932793", "rc": 0, "start": "2022-06-01 13:08:35.929311" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:08:36 +0000 (0:00:00.386) 0:02:10.443 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003031", "end": "2022-06-01 13:08:36.312605", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:08:36.309574" }

STDOUT:

luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:08:36 +0000 (0:00:00.380) 0:02:10.824 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.067) 0:02:10.891 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.087) 0:02:10.978 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.068) 0:02:11.046 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.031) 0:02:11.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.032) 0:02:11.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.034) 0:02:11.145 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.175 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.029) 0:02:11.205 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.235 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.266 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.031) 0:02:11.297 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.060) 0:02:11.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.388 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.418 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.030) 0:02:11.448 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.032) 0:02:11.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.035) 0:02:11.517 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.035) 0:02:11.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.031) 0:02:11.585 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.031) 0:02:11.616 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.060) 0:02:11.676 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => {
    "ansible_loop_var": "storage_test_lvmraid_volume",
    "changed": false,
    "skip_reason": "Conditional result was False",
    "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null }
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.042) 0:02:11.718 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.082) 0:02:11.800 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:08:37 +0000 (0:00:00.049) 0:02:11.850 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.036) 0:02:11.887 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.033) 0:02:11.920 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.031) 0:02:11.952 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.067) 0:02:12.020 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => {
    "ansible_loop_var": "storage_test_vdo_volume",
    "changed": false,
    "skip_reason": "Conditional result was False",
    "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null }
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.039) 0:02:12.060 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.031) 0:02:12.091 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.059) 0:02:12.151 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.036) 0:02:12.188 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.125) 0:02:12.313 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.035) 0:02:12.349 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
    "storage_test_mount_device_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "6db48060-a401-4730-955d-6fab78544743" } ],
    "storage_test_mount_expected_match_count": "1",
    "storage_test_mount_point_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "6db48060-a401-4730-955d-6fab78544743" } ],
    "storage_test_swap_expected_matches": "0"
}, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.042) 0:02:12.392 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.037) 0:02:12.430 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.035) 0:02:12.465 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.038) 0:02:12.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.029) 0:02:12.533 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.029) 0:02:12.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.029) 0:02:12.593 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.033) 0:02:12.626 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
    "storage_test_fstab_expected_id_matches": "1",
    "storage_test_fstab_expected_mount_options_matches": "1",
    "storage_test_fstab_expected_mount_point_matches": "1",
    "storage_test_fstab_id_matches": [ "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59 " ],
    "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ],
    "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ]
}, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.049) 0:02:12.676 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.035) 0:02:12.711 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.037) 0:02:12.748 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.030) 0:02:12.779 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.032) 0:02:12.811 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:08:38 +0000 (0:00:00.038) 0:02:12.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.042) 0:02:12.892 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": {
    "atime": 1654103310.4481215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103310.4481215, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14403,
    "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false,
    "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103310.4481215, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false
} }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.396) 0:02:13.288 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.039) 0:02:13.328 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.037) 0:02:13.365 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.032) 0:02:13.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.030) 0:02:13.429 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.035) 0:02:13.464 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": {
    "atime": 1654103310.6031215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103310.6031215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14448,
    "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false,
    "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103310.6031215, "nlink": 1, "path": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false
} }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:08:39 +0000 (0:00:00.378) 0:02:13.842 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.012473", "end": "2022-06-01 13:08:39.726160", "rc": 0, "start": "2022-06-01 13:08:39.713687" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           de641b06-8ece-4140-bb76-6d4558d85b59
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  10
        Memory:     906462
        Threads:    4
        Salt:       be 32 5b 91 36 a2 b9 47 73 f9 56 2a 00 44 1b be ca 96 ec 61 40 3d 00 3e ec 9d b9 fd f8 ad a3 3e
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 96946
        Salt:       f2 a7 25 37 8e e0 00 28 a9 8b 01 9a 0b 3f 1f f9 84 1b 6e 9e ce d0 b9 b2 6a dd a1 d3 ba 28 41 99
        Digest:     95 f1 1f c6 08 fb 58 1c ee 1f f1 c9 38 c6 de 66 f0 8d 39 22 2f 10 dc ed 41 a6 f1 c3 f9 e3 f2 95

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.398) 0:02:14.240 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.045) 0:02:14.286 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.041) 0:02:14.327 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.038) 0:02:14.366 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.038) 0:02:14.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.032) 0:02:14.438 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.031) 0:02:14.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.030) 0:02:14.500 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.043) 0:02:14.543 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.037) 0:02:14.581 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.040) 0:02:14.622 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.038) 0:02:14.660 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.037) 0:02:14.698 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.031) 0:02:14.730 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.034) 0:02:14.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.030) 0:02:14.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.030) 0:02:14.826 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:08:40 +0000 (0:00:00.031) 0:02:14.857 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.032) 0:02:14.890 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:14.922 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.033) 0:02:14.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:14.987 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.029) 0:02:15.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.028) 0:02:15.046 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.034) 0:02:15.080 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:15.112 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.033) 0:02:15.145 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:15.177 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.238 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.034) 0:02:15.273 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.034) 0:02:15.307 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.035) 0:02:15.342 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.373 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.029) 0:02:15.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.029)
0:02:15.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.032) 0:02:15.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:15.558 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.032) 0:02:15.590 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.031) 0:02:15.621 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.029) 0:02:15.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:08:41 +0000 (0:00:00.030) 0:02:15.682 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:233 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.398) 0:02:16.080 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.064) 0:02:16.144 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.046) 0:02:16.191 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.532) 0:02:16.723 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.086) 0:02:16.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:08:42 +0000 (0:00:00.034) 0:02:16.844 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:08:43 +0000 (0:00:00.032) 0:02:16.876 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:08:43 +0000 (0:00:00.065) 0:02:16.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:08:43 +0000 (0:00:00.029) 0:02:16.972 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:08:44 +0000 (0:00:00.904) 0:02:17.877 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": false,
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:08:44 +0000 (0:00:00.035) 0:02:17.915 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:08:44 +0000 (0:00:00.035) 0:02:17.950 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:08:45 +0000 (0:00:01.179) 0:02:19.129 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:08:45 +0000 (0:00:00.058) 0:02:19.187 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:08:45 +0000 (0:00:00.029) 0:02:19.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:08:45 +0000 (0:00:00.031) 0:02:19.248 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:08:45 +0000 (0:00:00.029) 0:02:19.277 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:08:46 +0000 (0:00:00.824) 0:02:20.102 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service": { "name": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:08:47 +0000 (0:00:01.697) 0:02:21.799 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:08:47 +0000 (0:00:00.048) 0:02:21.848 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dc13e2508\x2da9f2\x2d4f10\x2db7d8\x2d4d8b6ed5a4fd.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "name": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target dev-sda.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2dc13e2508\\\\x2da9f2\\\\x2d4f10\\\\x2db7d8\\\\x2d4d8b6ed5a4fd.target\" cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c13e2508-a9f2-4f10-b7d8-4d8b6ed5a4fd ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", 
"LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dc13e2508\\\\x2da9f2\\\\x2d4f10\\\\x2db7d8\\\\x2d4d8b6ed5a4fd.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:08:32 EDT", "StateChangeTimestampMonotonic": "2902017180", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dc13e2508\\\\x2da9f2\\\\x2d4f10\\\\x2db7d8\\\\x2d4d8b6ed5a4fd.target\"", 
"WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:08:48 +0000 (0:00:00.705) 0:02:22.553 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-de641b06-8ece-4140-bb76-6d4558d85b59' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:08:49 +0000 (0:00:01.160) 0:02:23.714 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, 
u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-de641b06-8ece-4140-bb76-6d4558d85b59' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:08:49 +0000 (0:00:00.043) 0:02:23.758 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dc13e2508\x2da9f2\x2d4f10\x2db7d8\x2d4d8b6ed5a4fd.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "name": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": 
"0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc13e2508\\x2da9f2\\x2d4f10\\x2db7d8\\x2d4d8b6ed5a4fd.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dc13e2508\\\\x2da9f2\\\\x2d4f10\\\\x2db7d8\\\\x2d4d8b6ed5a4fd.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", 
"RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:252 Wednesday 01 June 2022 17:08:50 +0000 (0:00:00.677) 0:02:24.435 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] 
********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:258 Wednesday 01 June 2022 17:08:50 +0000 (0:00:00.038) 0:02:24.473 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:08:50 +0000 (0:00:00.038) 0:02:24.512 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103321.5721216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103321.5721216, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103321.5721216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "910299505", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.386) 0:02:24.898 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:269 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.037) 0:02:24.935 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 
Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.051) 0:02:24.987 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.046) 0:02:25.033 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.530) 0:02:25.564 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.074) 0:02:25.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": 
false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.033) 0:02:25.671 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.032) 0:02:25.704 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.065) 0:02:25.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:08:51 +0000 (0:00:00.027) 0:02:25.797 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:08:52 +0000 (0:00:00.840) 0:02:26.637 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] 
*********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:08:52 +0000 (0:00:00.039) 0:02:26.677 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:08:52 +0000 (0:00:00.033) 0:02:26.710 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:08:54 +0000 (0:00:01.234) 0:02:27.945 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:08:54 +0000 (0:00:00.054) 0:02:28.000 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:08:54 +0000 (0:00:00.030) 0:02:28.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:08:54 +0000 (0:00:00.030) 0:02:28.060 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:08:54 +0000 (0:00:00.027) 0:02:28.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:08:55 +0000 (0:00:00.873) 0:02:28.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", 
"source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service": { "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:08:56 +0000 (0:00:01.743) 0:02:30.705 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:08:56 +0000 (0:00:00.090) 0:02:30.795 
******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dde641b06\x2d8ece\x2d4140\x2dbb76\x2d6d4558d85b59.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", 
"CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-de641b06-8ece-4140-bb76-6d4558d85b59", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-de641b06-8ece-4140-bb76-6d4558d85b59 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-de641b06-8ece-4140-bb76-6d4558d85b59 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": 
"yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", 
"MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": 
"5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:08:49 EDT", "StateChangeTimestampMonotonic": "2918929875", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:08:57 +0000 (0:00:00.710) 0:02:31.506 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK 
[linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:08:59 +0000 (0:00:01.685) 0:02:33.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:08:59 +0000 (0:00:00.032) 0:02:33.224 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dde641b06\x2d8ece\x2d4140\x2dbb76\x2d6d4558d85b59.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": 
"infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", 
"PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:08:49 EDT", "StateChangeTimestampMonotonic": "2918929875", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
"UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:09:00 +0000 (0:00:00.705) 0:02:33.929 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": 
"/dev/sda1", "_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:09:00 +0000 (0:00:00.044) 0:02:33.973 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:09:00 +0000 (0:00:00.038) 0:02:34.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:09:00 +0000 (0:00:00.035) 0:02:34.048 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-de641b06-8ece-4140-bb76-6d4558d85b59" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:09:00 
+0000 (0:00:00.419) 0:02:34.467 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:09:01 +0000 (0:00:00.676) 0:02:35.144 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:09:01 +0000 (0:00:00.428) 0:02:35.572 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:09:02 +0000 (0:00:00.661) 0:02:36.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103316.3121216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "bf21b694ae1ac5e475ba96b298d5939504e19f14", "ctime": 1654103314.1811216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8730271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, 
"isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103314.1801214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "336784171", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:09:02 +0000 (0:00:00.404) 0:02:36.638 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-de641b06-8ece-4140-bb76-6d4558d85b59', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-de641b06-8ece-4140-bb76-6d4558d85b59", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:09:03 +0000 (0:00:00.407) 0:02:37.046 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:286 Wednesday 01 June 2022 17:09:04 +0000 (0:00:00.865) 0:02:37.911 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:09:04 +0000 (0:00:00.049) 0:02:37.961 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:09:04 +0000 (0:00:00.043) 0:02:38.004 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:09:04 +0000 (0:00:00.031) 0:02:38.036 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "ea9aff3d-4787-4a37-8a93-518ba8010d4a" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:09:04 
+0000 (0:00:00.396) 0:02:38.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003417", "end": "2022-06-01 13:09:04.297279", "rc": 0, "start": "2022-06-01 13:09:04.293862" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:09:04 +0000 (0:00:00.424) 0:02:38.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002959", "end": "2022-06-01 13:09:04.750875", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:09:04.747916" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.402) 0:02:39.259 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
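The warning above is raised because the task that loops over the pools reuses a variable name (`storage_test_pool`) that is already set elsewhere. The fix Ansible suggests is to pick a distinct name via `loop_control`. A minimal sketch of that fix, assuming the task in verify-role-results.yml loops over `_storage_pools_list` (the task name is taken from the log; the loop variable `storage_test_pool_item` is a hypothetical replacement name):

```yaml
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    # Hypothetical distinct name; avoids colliding with the
    # 'storage_test_pool' variable already in use, which is what
    # triggers the [WARNING] above.
    loop_var: storage_test_pool_item
```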
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.066) 0:02:39.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.356 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.062) 0:02:39.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.450 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.028) 0:02:39.478 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.028) 0:02:39.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.032) 0:02:39.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.633 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.027) 0:02:39.661 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.060) 0:02:39.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.032) 0:02:39.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.031) 0:02:39.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.030) 0:02:39.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:09:05 +0000 (0:00:00.030) 0:02:39.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.030) 0:02:39.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.032) 0:02:39.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.031) 0:02:39.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
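The long run of `skipping: ... "Conditional result was False"` results above is normal for this test: each RAID-verification task is guarded by a `when:` clause that evaluates false for a partition-type pool with no RAID level. A hedged sketch of the pattern (the `mdadm` command and variable names here are illustrative assumptions, not the exact contents of verify-pool-md.yml):

```yaml
- name: get information about RAID
  # Only runs when the pool under test actually declares a RAID level;
  # for this pool raid_level is null, so the task is skipped with
  # "Conditional result was False".
  command: mdadm --detail /dev/md0   # hypothetical device path
  when: storage_test_pool.raid_level is not none
```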
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.031) 0:02:39.972 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.059) 0:02:40.031 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'cache_size': 0, u'_mount_id': u'UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.041) 0:02:40.073 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.058) 0:02:40.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.036) 0:02:40.168 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.036) 0:02:40.204 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 
Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.032) 0:02:40.236 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.035) 0:02:40.272 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.064) 0:02:40.337 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'cache_size': 0, u'_mount_id': u'UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", 
"_mount_id": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.042) 0:02:40.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.037) 0:02:40.417 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
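The `[WARNING]` above is Ansible reporting a loop-variable collision: the include that iterates the volumes reuses `storage_test_volume`, which an outer loop already holds. The warning itself names the remedy, `loop_control.loop_var`. A minimal sketch of that fix (file and list names are assumptions taken from the log's naming style):

```yaml
# Sketch of the fix the warning suggests: give the inner loop its own
# variable name so it cannot shadow the outer loop's 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"   # list expression is an assumption
  loop_control:
    loop_var: storage_test_pool_volume      # distinct name avoids the collision
```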
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.063) 0:02:40.480 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.040) 0:02:40.521 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.123) 0:02:40.645 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.038) 0:02:40.683 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592102, "block_size": 4096, "block_total": 2618624, "block_used": 26522, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617249792, "size_total": 10725883904, "uuid": "ea9aff3d-4787-4a37-8a93-518ba8010d4a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592102, "block_size": 4096, "block_total": 2618624, "block_used": 26522, "device": "/dev/sda1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617249792, "size_total": 10725883904, "uuid": "ea9aff3d-4787-4a37-8a93-518ba8010d4a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.041) 0:02:40.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.038) 0:02:40.763 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.035) 0:02:40.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] 
***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.037) 0:02:40.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:09:06 +0000 (0:00:00.028) 0:02:40.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.032) 0:02:40.897 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.081) 0:02:40.979 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.032) 0:02:41.012 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.044) 0:02:41.056 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.033) 0:02:41.089 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.033) 0:02:41.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.028) 0:02:41.151 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.031) 0:02:41.183 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.034) 0:02:41.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.039) 0:02:41.258 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103338.6071215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103338.6071215, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14403, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103338.6071215, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.389) 0:02:41.647 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.036) 0:02:41.683 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.038) 0:02:41.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.039) 0:02:41.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.033) 0:02:41.795 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.036) 0:02:41.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:09:07 +0000 (0:00:00.030) 0:02:41.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:41.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 
01 June 2022 17:09:08 +0000 (0:00:00.031) 0:02:41.925 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.037) 0:02:41.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.033) 0:02:41.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.029) 0:02:42.117 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.037) 0:02:42.154 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.037) 0:02:42.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.029) 0:02:42.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.282 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.031) 0:02:42.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.033) 0:02:42.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.031) 0:02:42.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.031) 0:02:42.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.030) 0:02:42.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.034) 0:02:42.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.031) 0:02:42.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.634 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.034) 0:02:42.668 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.033) 
0:02:42.701 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.035) 0:02:42.770 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.032) 0:02:42.834 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:09:08 +0000 (0:00:00.035) 0:02:42.870 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.032) 0:02:42.902 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
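The `"VARIABLE IS NOT DEFINED!"` lines above come from `debug` tasks printing `storage_test_expected_size`, a fact that is only set on the size-calculation paths that were skipped for this volume. If that placeholder text is undesirable, a `default` filter keeps the debug output clean; a minimal sketch (the fallback string is an assumption, not the role's behavior):

```yaml
# Sketch: print an optionally-defined fact without the
# "VARIABLE IS NOT DEFINED!" placeholder seen in the log.
- name: debug
  debug:
    msg: "{{ storage_test_expected_size | default('not set for this volume type') }}"
```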
TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.030) 0:02:42.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.033) 0:02:42.966 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.030) 0:02:42.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.031) 0:02:43.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.031) 0:02:43.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.030) 0:02:43.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.029) 0:02:43.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.032) 0:02:43.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.032) 0:02:43.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.031) 0:02:43.216 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.028) 0:02:43.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:09:09 +0000 (0:00:00.031) 0:02:43.276 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } 
TASK [Add encryption to the volume] ********************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:292
Wednesday 01 June 2022  17:09:09 +0000 (0:00:00.372)       0:02:43.648 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:09:09 +0000 (0:00:00.050)       0:02:43.698 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:09:09 +0000 (0:00:00.044)       0:02:43.743 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.751)       0:02:44.495 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.073)       0:02:44.568 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.032)       0:02:44.601 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.033)       0:02:44.634 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.064)       0:02:44.699 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:09:10 +0000 (0:00:00.025)       0:02:44.724 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:09:11 +0000 (0:00:00.863)       0:02:45.587 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:09:11 +0000 (0:00:00.040)       0:02:45.627 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:09:11 +0000 (0:00:00.033)       0:02:45.661 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:09:12 +0000 (0:00:01.208)       0:02:46.869 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  17:09:13 +0000 (0:00:00.060)       0:02:46.929 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  17:09:13 +0000 (0:00:00.029)       0:02:46.959 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:09:13 +0000 (0:00:00.032) 0:02:46.992 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:09:13 +0000 (0:00:00.028) 0:02:47.020 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:09:13 +0000 (0:00:00.806) 0:02:47.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": 
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { 
"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service": { "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:09:15 +0000 (0:00:01.686) 0:02:49.514 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:09:15 +0000 (0:00:00.044) 0:02:49.558 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dde641b06\x2d8ece\x2d4140\x2dbb76\x2d6d4558d85b59.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service 
cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-de641b06-8ece-4140-bb76-6d4558d85b59", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-de641b06-8ece-4140-bb76-6d4558d85b59 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-de641b06-8ece-4140-bb76-6d4558d85b59 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-de641b06-8ece-4140-bb76-6d4558d85b59 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:08:49 EDT", "StateChangeTimestampMonotonic": "2918929875", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:09:16 +0000 (0:00:00.720) 0:02:50.279 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption
TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:09:17 +0000 (0:00:01.179) 0:02:51.458 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'partition', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'partition', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"}
TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:09:17 +0000 (0:00:00.045) 0:02:51.504 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dde641b06\x2d8ece\x2d4140\x2dbb76\x2d6d4558d85b59.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "name": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service",
"status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": 
"/etc/systemd/system/systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit 
systemd-cryptsetup@luks\\x2dde641b06\\x2d8ece\\x2d4140\\x2dbb76\\x2d6d4558d85b59.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dde641b06\\\\x2d8ece\\\\x2d4140\\\\x2dbb76\\\\x2d6d4558d85b59.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:313 Wednesday 01 June 2022 17:09:18 +0000 (0:00:00.720) 0:02:52.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:319 Wednesday 01 June 2022 17:09:18 +0000 (0:00:00.036) 0:02:52.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:09:18 +0000 (0:00:00.034) 0:02:52.295 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103349.1421216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103349.1421216, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103349.1421216, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3842660007", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:09:18 +0000 (0:00:00.385) 0:02:52.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:332 Wednesday 01 June 2022 17:09:18 +0000 (0:00:00.039) 0:02:52.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testq2pyi9i7lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:339 Wednesday 01 June 2022 17:09:19 +0000 (0:00:00.535) 0:02:53.255 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testq2pyi9i7lukskey", "gid": 0, "group": "root", 
"md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1654103359.44-90129-170255405984902/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:346 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.802) 0:02:54.058 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.051) 0:02:54.110 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.045) 0:02:54.155 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.514) 0:02:54.670 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.073) 0:02:54.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.031) 0:02:54.775 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:09:20 +0000 (0:00:00.036) 0:02:54.812 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:09:21 +0000 (0:00:00.076) 0:02:54.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:09:21 +0000 (0:00:00.027) 0:02:54.916 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:09:21 +0000 (0:00:00.909) 0:02:55.825 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:09:21 +0000 (0:00:00.041) 0:02:55.867 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:09:22 +0000 (0:00:00.035) 0:02:55.902 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:09:23 +0000 (0:00:01.151) 0:02:57.054 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:09:23 +0000 (0:00:00.057) 0:02:57.112 
******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:09:23 +0000 (0:00:00.030) 0:02:57.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:09:23 +0000 (0:00:00.031) 0:02:57.174 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:09:23 +0000 (0:00:00.029) 0:02:57.203 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:09:24 +0000 (0:00:00.836) 0:02:58.040 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", 
"state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" 
}, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": 
"systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 17:09:25 +0000 (0:00:01.731) 0:02:59.772 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:09:25 +0000 (0:00:00.047) 0:02:59.819 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:09:25 +0000 (0:00:00.028) 0:02:59.848 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "/tmp/storage_testq2pyi9i7lukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null,
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:09:36 +0000 (0:00:10.597) 0:03:10.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:09:36 +0000 (0:00:00.031) 0:03:10.477 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:09:36 +0000 (0:00:00.095) 0:03:10.572 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "/tmp/storage_testq2pyi9i7lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1",
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:09:36 +0000 (0:00:00.044) 0:03:10.617 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:09:36 +0000 (0:00:00.040) 0:03:10.658 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:09:36 +0000 (0:00:00.037) 0:03:10.695 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ea9aff3d-4787-4a37-8a93-518ba8010d4a" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:09:37 +0000 (0:00:00.413) 0:03:11.109 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:09:38 +0000 (0:00:00.870) 0:03:11.980 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:09:38 +0000 (0:00:00.420) 0:03:12.400 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:09:39 +0000 (0:00:00.677) 0:03:13.078 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103344.7501216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103342.5281215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false,
"islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103342.5271215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "767492423", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:09:39 +0000 (0:00:00.376) 0:03:13.454 ********
changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'/tmp/storage_testq2pyi9i7lukskey', u'name': u'luks-cb442646-6655-4ff9-af7a-405783893335', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "/tmp/storage_testq2pyi9i7lukskey", "state": "present" } }
MSG: line added

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:09:39 +0000 (0:00:00.400) 0:03:13.855 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:363
Wednesday 01 June 2022 17:09:40 +0000 (0:00:00.848) 0:03:14.703 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:09:40 +0000 (0:00:00.049) 0:03:14.752 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false,
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:09:40 +0000 (0:00:00.043) 0:03:14.795 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:09:40 +0000 (0:00:00.031) 0:03:14.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "size": "10G", "type": "crypt", "uuid": "a378f9f1-663a-43eb-9b43-cdf6ebae83b3" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cb442646-6655-4ff9-af7a-405783893335" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": 
"", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:09:41 +0000 (0:00:00.381) 0:03:15.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002915", "end": "2022-06-01 13:09:41.069063", "rc": 0, "start": "2022-06-01 13:09:41.066148" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:09:41 +0000 (0:00:00.372) 0:03:15.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002666", "end": "2022-06-01 13:09:41.437616", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:09:41.434950" } STDOUT: luks-cb442646-6655-4ff9-af7a-405783893335 /dev/sda1 /tmp/storage_testq2pyi9i7lukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.368) 0:03:15.949 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.067) 0:03:16.017 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.033) 0:03:16.050 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.066) 0:03:16.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.074) 0:03:16.192 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.030) 0:03:16.222 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.029) 0:03:16.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.031) 0:03:16.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.031) 0:03:16.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.031) 0:03:16.347 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.034) 0:03:16.382 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.033) 0:03:16.415 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.060) 0:03:16.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.031) 0:03:16.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.045) 0:03:16.553 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.035) 0:03:16.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.033) 0:03:16.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.032) 0:03:16.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.033) 0:03:16.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.033) 0:03:16.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.044) 0:03:16.767 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:09:42 +0000 (0:00:00.061) 0:03:16.828 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'/tmp/storage_testq2pyi9i7lukskey', u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": 
"/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.043) 0:03:16.872 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.065) 0:03:16.937 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.038) 0:03:16.975 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.030) 0:03:17.006 ******** 
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.029) 0:03:17.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.033) 0:03:17.069 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.066) 0:03:17.135 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'/tmp/storage_testq2pyi9i7lukskey', u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], 
u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testq2pyi9i7lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.040) 0:03:17.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.032) 0:03:17.208 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.060) 0:03:17.269 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.037) 0:03:17.307 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.128) 0:03:17.435 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.037) 0:03:17.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "a378f9f1-663a-43eb-9b43-cdf6ebae83b3" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2588034, "block_size": 4096, "block_total": 2614528, "block_used": 26494, "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fstype": "xfs", "inode_available": 5234173, "inode_total": 5234176, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10600587264, "size_total": 10709106688, "uuid": "a378f9f1-663a-43eb-9b43-cdf6ebae83b3" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.043) 0:03:17.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.040) 0:03:17.557 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.035) 0:03:17.592 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.039) 0:03:17.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.030) 0:03:17.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.030) 0:03:17.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.030) 0:03:17.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.034) 0:03:17.758 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.048) 0:03:17.806 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:09:43 +0000 (0:00:00.035) 0:03:17.842 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.036) 0:03:17.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.031) 0:03:17.910 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.034) 0:03:17.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.043) 0:03:17.988 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.039) 0:03:18.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103375.6961215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103375.6961215, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14403, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103375.6961215, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.434) 0:03:18.462 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.040) 0:03:18.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.039) 0:03:18.542 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.034) 0:03:18.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.032) 0:03:18.609 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:09:44 +0000 (0:00:00.039) 0:03:18.648 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103375.8541214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103375.8541214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 14893, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103375.8541214, "nlink": 1, "path": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.379) 0:03:19.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.012032", "end": "2022-06-01 13:09:44.914756", "rc": 0, "start": "2022-06-01 13:09:44.902724" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: cb442646-6655-4ff9-af7a-405783893335 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 10 Memory: 906462 Threads: 4 Salt: 35 da fd d0 ec 23 04 6e f3 83 82 39 6b a8 41 70 79 c6 e1 09 9f 60 7d dd 47 e4 07 f5 1b ae f1 83 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 96234 Salt: a3 bb 2d 31 dc b4 4f 0e ce 54 1f b5 a2 91 f2 7d 05 e0 07 3f b4 4c 82 f6 45 97 a5 f6 69 ff 4f e0 Digest: 7d 85 86 3c f5 5b 75 7c e1 e5 05 24 c5 07 c3 66 e1 b4 d9 9b 4a fc 5d 90 37 0f a8 ed 25 88 6e 6d TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.401) 0:03:19.429 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.043) 0:03:19.472 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.037) 0:03:19.510 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.038) 0:03:19.549 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.038) 0:03:19.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.031) 0:03:19.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.035) 0:03:19.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:09:45 +0000 
(0:00:00.032) 0:03:19.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-cb442646-6655-4ff9-af7a-405783893335 /dev/sda1 /tmp/storage_testq2pyi9i7lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testq2pyi9i7lukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.040) 0:03:19.727 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.036) 0:03:19.764 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.041) 0:03:19.805 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:09:45 +0000 (0:00:00.039) 0:03:19.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.041) 0:03:19.887 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": 
false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.033) 0:03:19.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:19.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:19.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.031) 0:03:20.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.031) 0:03:20.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.034) 0:03:20.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] 
********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.114 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.031) 0:03:20.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.033) 0:03:20.244 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.036) 0:03:20.280 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.033) 0:03:20.313 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.036) 0:03:20.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.035) 0:03:20.385 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.037) 0:03:20.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.035) 0:03:20.458 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.044) 0:03:20.503 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.038) 0:03:20.541 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.033) 0:03:20.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.031) 0:03:20.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.033) 0:03:20.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.036) 0:03:20.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.030) 0:03:20.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.032) 0:03:20.836 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:09:46 +0000 (0:00:00.034) 0:03:20.870 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:09:47 +0000 (0:00:00.033) 0:03:20.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:365 Wednesday 01 June 2022 17:09:47 +0000 (0:00:00.032) 0:03:20.937 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "path": "/tmp/storage_testq2pyi9i7lukskey", "state": "absent" } TASK [Create an encrypted lvm volume w/ 
default fs] **************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:377 Wednesday 01 June 2022 17:09:47 +0000 (0:00:00.460) 0:03:21.397 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:09:47 +0000 (0:00:00.053) 0:03:21.451 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:09:47 +0000 (0:00:00.047) 0:03:21.499 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.568) 0:03:22.067 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.070) 0:03:22.138 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.032) 0:03:22.170 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.031) 0:03:22.202 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.064) 0:03:22.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:09:48 +0000 (0:00:00.027) 0:03:22.294 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:09:49 +0000 (0:00:00.930) 0:03:23.225 ******** ok: 
[/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "lvm",
            "volumes": [
                {
                    "encryption": true,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:09:49 +0000 (0:00:00.040)       0:03:23.265 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:09:49 +0000 (0:00:00.034)       0:03:23.299 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup",
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:09:50 +0000 (0:00:01.155)       0:03:24.455 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:09:50 +0000 (0:00:00.036)       0:03:24.512 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:09:50 +0000 (0:00:00.033)       0:03:24.548 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs]
******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:09:50 +0000 (0:00:00.033) 0:03:24.582 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:09:50 +0000 (0:00:00.030) 0:03:24.612 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:09:51 +0000 (0:00:00.815) 0:03:25.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": 
"systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": 
"nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": 
"systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:09:53 +0000 (0:00:01.699) 0:03:27.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:09:53 +0000 
(0:00:00.049) 0:03:27.176 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:09:53 +0000 (0:00:00.029) 0:03:27.206 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:09:54 +0000 (0:00:01.253) 0:03:28.459 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, 
u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:09:54 +0000 (0:00:00.043) 0:03:28.503 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:397 Wednesday 01 
June 2022 17:09:54 +0000 (0:00:00.029) 0:03:28.532 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the output of the keyless luks test] ******************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:403
Wednesday 01 June 2022 17:09:54 +0000 (0:00:00.036) 0:03:28.569 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Create an encrypted lvm volume w/ default fs] ****************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:410
Wednesday 01 June 2022 17:09:54 +0000 (0:00:00.035) 0:03:28.605 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:09:54 +0000 (0:00:00.053) 0:03:28.658 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:09:54 +0000 (0:00:00.043) 0:03:28.702 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.560) 0:03:29.263 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.073) 0:03:29.337 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.033) 0:03:29.370 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.031) 0:03:29.402 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.063) 0:03:29.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:09:55 +0000 (0:00:00.026) 0:03:29.492 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:09:56 +0000 (0:00:00.869) 0:03:30.362 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:09:56 +0000 (0:00:00.039) 0:03:30.401 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:09:56 +0000 (0:00:00.035) 0:03:30.437 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:09:57 +0000 (0:00:01.163) 0:03:31.600 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:09:57 +0000 (0:00:00.057) 0:03:31.658 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:09:57 +0000 (0:00:00.028) 0:03:31.687 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:09:57 +0000 (0:00:00.032) 0:03:31.720 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:09:57 +0000 (0:00:00.040) 0:03:31.760 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:09:58 +0000 (0:00:00.820) 0:03:32.580 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd",
"state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": 
"grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": 
"raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": 
"unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": 
{ "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": 
"systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": 
"running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:10:00 +0000 (0:00:01.685) 0:03:34.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:10:00 +0000 (0:00:00.051) 0:03:34.317 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:10:00 +0000 (0:00:00.029) 0:03:34.347 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:10:08 +0000 (0:00:08.129) 0:03:42.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:10:08 +0000 (0:00:00.032) 0:03:42.510 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:10:08 +0000 (0:00:00.029) 0:03:42.539 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", 
"encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:10:08 +0000 (0:00:00.043) 0:03:42.583 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:10:08 +0000 (0:00:00.038) 0:03:42.621 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:10:08 +0000 (0:00:00.039) 0:03:42.661 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cb442646-6655-4ff9-af7a-405783893335" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:10:09 +0000 (0:00:00.399) 0:03:43.060 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:10:09 +0000 (0:00:00.686) 0:03:43.746 ******** 
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:10:10 +0000 (0:00:00.424) 0:03:44.171 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:10:10 +0000 (0:00:00.645) 0:03:44.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103381.4371216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0dff5c13df60c9949e0891add9ab1c4a0909a489", "ctime": 1654103379.3321216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21709, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103379.3311214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3080604317", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:10:11 +0000 (0:00:00.383) 0:03:45.200 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-cb442646-6655-4ff9-af7a-405783893335', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-cb442646-6655-4ff9-af7a-405783893335", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:10:12 +0000 (0:00:00.741) 0:03:45.941 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:429 Wednesday 01 June 2022 17:10:12 +0000 (0:00:00.900) 0:03:46.842 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:10:13 +0000 (0:00:00.061) 0:03:46.904 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:10:13 +0000 (0:00:00.047) 0:03:46.952 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:10:13 +0000 (0:00:00.033) 0:03:46.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" }, "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "size": "4G", "type": "crypt", "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:10:13 +0000 (0:00:00.398) 0:03:47.384 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002780", "end": "2022-06-01 13:10:13.265768", "rc": 0, "start": "2022-06-01 13:10:13.262988" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:10:13 +0000 (0:00:00.393) 0:03:47.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003071", "end": "2022-06-01 13:10:13.646730", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:10:13.643659" } STDOUT: luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:10:14 +0000 (0:00:00.381) 0:03:48.158 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:10:14 +0000 (0:00:00.066) 0:03:48.225 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:10:14 +0000 (0:00:00.033) 0:03:48.259 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:10:14 +0000 (0:00:00.067) 0:03:48.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:10:14 +0000 (0:00:00.041) 0:03:48.369 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.583) 0:03:48.953 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.042) 0:03:48.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.037) 0:03:49.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.035) 0:03:49.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.036) 0:03:49.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.036) 0:03:49.141 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.049) 0:03:49.191 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.059) 0:03:49.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.031) 0:03:49.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.031) 0:03:49.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.031) 0:03:49.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.032) 0:03:49.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.081) 0:03:49.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:10:15 +0000 (0:00:00.033) 0:03:49.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.031) 0:03:49.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.032) 0:03:49.557 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.063) 0:03:49.620 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.064) 0:03:49.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.032) 0:03:49.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:10:15 +0000 (0:00:00.032) 0:03:49.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.034) 0:03:49.784 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:10:15 +0000 (0:00:00.064) 0:03:49.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.037) 0:03:49.886 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.035) 0:03:49.922 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.062) 0:03:49.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.037) 0:03:50.022 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.035) 0:03:50.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.032) 0:03:50.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.030) 0:03:50.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.188 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.031) 0:03:50.220 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.064) 0:03:50.284 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.074) 0:03:50.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.042) 0:03:50.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.034) 0:03:50.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.031) 0:03:50.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.035) 0:03:50.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.032) 0:03:50.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.033) 0:03:50.671 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.067) 0:03:50.738 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:10:16 +0000 (0:00:00.037) 0:03:50.776 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.126) 0:03:50.902 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.037) 0:03:50.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.044) 0:03:50.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.038) 0:03:51.022 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.035) 0:03:51.058 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.038) 0:03:51.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.029) 0:03:51.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.030) 0:03:51.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.034) 0:03:51.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.032) 0:03:51.222 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.049) 0:03:51.272 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.036) 0:03:51.309 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.034) 0:03:51.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.030) 0:03:51.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.094) 0:03:51.467 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.041) 0:03:51.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:10:17 +0000 (0:00:00.039) 0:03:51.549 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103407.7241216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103407.7241216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15137, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103407.7241216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.385) 0:03:51.935 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.039) 0:03:51.974 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.039) 0:03:52.014 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.035) 0:03:52.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.032) 0:03:52.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.037) 0:03:52.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103407.8871214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103407.8871214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15176, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103407.8871214, "nlink": 1, "path": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "pw_name": 
"root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:10:18 +0000 (0:00:00.389) 0:03:52.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.011921", "end": "2022-06-01 13:10:18.387284", "rc": 0, "start": "2022-06-01 13:10:18.375363" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 80 aa ef a0 8f bd 71 01 11 e9 7c 95 b6 4a 4a 48 7f de 43 47 MK salt: 04 aa cf 60 a6 a2 e4 4e 4a fe 05 dc c8 09 cd 25 8e df 7e c5 92 34 fa 14 5a 37 8c 1a 24 ce a7 de MK iterations: 95812 UUID: 12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 Key Slot 0: ENABLED Iterations: 1572076 Salt: 75 34 de 46 23 6e 06 74 5e fe 9e 12 31 ff ec 18 ce c0 db 77 0d e5 fc e3 9f 0b d3 89 86 94 9f 9d Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.394) 0:03:52.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.041) 0:03:52.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.039) 0:03:52.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.039) 0:03:53.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.038) 0:03:53.062 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.041) 0:03:53.104 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.040) 0:03:53.144 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.039) 0:03:53.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, 
"changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.039) 0:03:53.223 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.036) 0:03:53.259 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.039) 0:03:53.299 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.043) 0:03:53.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.038) 0:03:53.381 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.032) 0:03:53.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.032) 0:03:53.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.032) 0:03:53.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.032) 0:03:53.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.034) 0:03:53.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.031) 0:03:53.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.031) 0:03:53.609 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:10:19 +0000 (0:00:00.032) 0:03:53.642 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.536) 0:03:54.178 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.368) 0:03:54.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.039) 0:03:54.586 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.037) 0:03:54.623 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.033) 0:03:54.656 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.033) 0:03:54.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.033) 0:03:54.723 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.032) 0:03:54.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.031) 0:03:54.787 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.038) 0:03:54.826 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:10:20 +0000 (0:00:00.035) 0:03:54.862 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.040) 0:03:54.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", 
"name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037762", "end": "2022-06-01 13:10:20.799593", "rc": 0, "start": "2022-06-01 13:10:20.761831" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.407) 0:03:55.309 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.044) 0:03:55.354 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.040) 0:03:55.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.033) 0:03:55.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.034) 0:03:55.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.034) 0:03:55.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.032) 0:03:55.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.032) 0:03:55.563 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.031) 0:03:55.595 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.028) 0:03:55.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:431 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.035) 0:03:55.658 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.105) 0:03:55.764 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:10:21 +0000 (0:00:00.058) 0:03:55.822 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.516) 0:03:56.338 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.087) 0:03:56.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.036) 0:03:56.463 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.033) 0:03:56.496 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.070) 0:03:56.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:10:22 +0000 (0:00:00.027) 0:03:56.595 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:10:23 +0000 (0:00:00.926) 0:03:57.521 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:10:23 +0000 (0:00:00.042) 0:03:57.563 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:10:23 +0000 (0:00:00.037) 0:03:57.601 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:10:25 +0000 (0:00:01.313) 0:03:58.914 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:10:25 +0000 (0:00:00.058) 0:03:58.972 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:10:25 +0000 (0:00:00.031) 0:03:59.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:10:25 +0000 (0:00:00.031) 0:03:59.035 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:10:25 +0000 (0:00:00.030) 0:03:59.066 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:10:26 +0000 (0:00:00.899) 0:03:59.966 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": 
"oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { 
"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": 
"sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service": { "name": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:10:27 +0000 (0:00:01.691) 0:04:01.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:10:27 +0000 (0:00:00.049) 0:04:01.707 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2dcb442646\x2d6655\x2d4ff9\x2daf7a\x2d405783893335.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "name": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "-.mount systemd-udevd-kernel.socket tmp.mount dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2dcb442646\\\\x2d6655\\\\x2d4ff9\\\\x2daf7a\\\\x2d405783893335.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", 
"CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-cb442646-6655-4ff9-af7a-405783893335", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-cb442646-6655-4ff9-af7a-405783893335 /dev/sda1 /tmp/storage_testq2pyi9i7lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-cb442646-6655-4ff9-af7a-405783893335 /dev/sda1 /tmp/storage_testq2pyi9i7lukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-cb442646-6655-4ff9-af7a-405783893335 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-cb442646-6655-4ff9-af7a-405783893335 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "FreezerState": "running", 
"GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", 
"MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dcb442646\\\\x2d6655\\\\x2d4ff9\\\\x2daf7a\\\\x2d405783893335.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_testq2pyi9i7lukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": 
"5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:10 EDT", "StateChangeTimestampMonotonic": "2999321957", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2dcb442646\\\\x2d6655\\\\x2d4ff9\\\\x2daf7a\\\\x2d405783893335.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:10:28 +0000 (0:00:00.706) 0:04:02.413 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" } ], "packages": [ 
"cryptsetup", "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:10:29 +0000 (0:00:01.297) 0:04:03.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 
Wednesday 01 June 2022 17:10:29 +0000 (0:00:00.031) 0:04:03.743 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2dcb442646\x2d6655\x2d4ff9\x2daf7a\x2d405783893335.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "name": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", 
"LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dcb442646\\x2d6655\\x2d4ff9\\x2daf7a\\x2d405783893335.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2dcb442646\\\\x2d6655\\\\x2d4ff9\\\\x2daf7a\\\\x2d405783893335.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", 
"RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:10:30 +0000 (0:00:00.686) 0:04:04.430 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", 
"/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
17:10:30 +0000 (0:00:00.047) 0:04:04.477 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:10:30 +0000 (0:00:00.045) 0:04:04.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:10:30 +0000 (0:00:00.038) 0:04:04.560 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:10:30 +0000 (0:00:00.041) 0:04:04.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:10:31 +0000 (0:00:00.689) 0:04:05.292 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:10:31 +0000 (0:00:00.420) 0:04:05.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:10:32 +0000 (0:00:00.673) 0:04:06.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"stat": { "atime": 1654103413.6461215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0abd6c3b575420382c580697f1d39e5f08cdaaf4", "ctime": 1654103411.4221215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792398, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103411.4221215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3853900254", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:10:32 +0000 (0:00:00.437) 0:04:06.823 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:10:32 +0000 (0:00:00.030) 0:04:06.854 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:445 Wednesday 01 June 2022 17:10:33 +0000 (0:00:00.897) 0:04:07.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:451 Wednesday 01 June 2022 17:10:33 +0000 (0:00:00.040) 0:04:07.791 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:10:33 +0000 (0:00:00.067) 0:04:07.858 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:10:34 +0000 (0:00:00.042) 0:04:07.901 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:10:34 +0000 (0:00:00.030) 0:04:07.932 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" }, "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "size": "4G", "type": "crypt", "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:10:34 +0000 (0:00:00.393) 0:04:08.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003131", "end": "2022-06-01 13:10:34.199168", "rc": 0, "start": "2022-06-01 13:10:34.196037" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:10:34 +0000 (0:00:00.389) 0:04:08.715 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003384", "end": "2022-06-01 13:10:34.595930", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:10:34.592546" } STDOUT: luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.395) 0:04:09.110 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.070) 0:04:09.180 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.034) 0:04:09.215 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.063) 0:04:09.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.040) 0:04:09.319 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.377) 0:04:09.696 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.045) 0:04:09.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.039) 0:04:09.782 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.037) 0:04:09.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:10:35 +0000 (0:00:00.037) 0:04:09.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.032) 0:04:09.889 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.042) 0:04:09.932 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.060) 0:04:09.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.080) 0:04:10.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.031) 0:04:10.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.029) 0:04:10.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.028) 0:04:10.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.029) 0:04:10.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:10:36 +0000 (0:00:00.028) 0:04:10.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.045) 0:04:10.267 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.038) 0:04:10.306 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.065) 0:04:10.371 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.069) 0:04:10.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.033) 0:04:10.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:10:36 +0000 (0:00:00.034) 0:04:10.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.037) 0:04:10.546 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.068) 0:04:10.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.038) 0:04:10.653 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.037) 0:04:10.691 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.059) 0:04:10.751 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.040) 0:04:10.791 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.041) 0:04:10.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:10:36 +0000 (0:00:00.038) 0:04:10.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.033) 0:04:10.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.033) 0:04:10.938 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:10.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.036) 0:04:11.009 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.066) 0:04:11.075 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.068) 0:04:11.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.032) 0:04:11.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.033) 0:04:11.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.032) 0:04:11.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:11.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:11.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.033) 0:04:11.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:11.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:11.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.034) 0:04:11.448 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.068) 0:04:11.516 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.039) 0:04:11.556 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.136) 0:04:11.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.039) 0:04:11.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1029949, "block_size": 4096, "block_total": 1045504, "block_used": 15555, "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4218671104, "size_total": 4282384384, "uuid": "870f7bcb-d5d5-4d84-a1cb-85fd4821dc7b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.055) 0:04:11.787 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:10:37 +0000 (0:00:00.044) 0:04:11.831 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.040) 0:04:11.872 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.043) 0:04:11.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.030) 0:04:11.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.032) 0:04:11.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.035) 0:04:12.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.032) 0:04:12.046 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.108) 0:04:12.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.036) 0:04:12.192 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.037) 0:04:12.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.032) 0:04:12.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.032) 0:04:12.293 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.038) 0:04:12.332 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.041) 0:04:12.373 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103418.3831215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103407.7241216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15137, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103407.7241216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.393) 0:04:12.767 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.038) 0:04:12.806 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:10:38 +0000 (0:00:00.039) 0:04:12.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.034) 0:04:12.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.031) 0:04:12.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.035) 0:04:12.946 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103429.1481216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103407.8871214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15176, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103407.8871214, "nlink": 1, "path": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "pw_name": 
"root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.423) 0:04:13.370 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.012927", "end": "2022-06-01 13:10:39.250787", "rc": 0, "start": "2022-06-01 13:10:39.237860" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 80 aa ef a0 8f bd 71 01 11 e9 7c 95 b6 4a 4a 48 7f de 43 47 MK salt: 04 aa cf 60 a6 a2 e4 4e 4a fe 05 dc c8 09 cd 25 8e df 7e c5 92 34 fa 14 5a 37 8c 1a 24 ce a7 de MK iterations: 95812 UUID: 12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 Key Slot 0: ENABLED Iterations: 1572076 Salt: 75 34 de 46 23 6e 06 74 5e fe 9e 12 31 ff ec 18 ce c0 db 77 0d e5 fc e3 9f 0b d3 89 86 94 9f 9d Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.399) 0:04:13.769 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.041) 0:04:13.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:10:39 +0000 (0:00:00.040) 0:04:13.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.039) 0:04:13.891 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.039) 0:04:13.931 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.038) 0:04:13.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.034) 0:04:14.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.031) 0:04:14.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.038) 0:04:14.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.036) 0:04:14.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.038) 0:04:14.149 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.040) 0:04:14.190 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.041) 0:04:14.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.032) 0:04:14.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.032) 0:04:14.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.032) 0:04:14.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.032) 0:04:14.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.032) 0:04:14.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.036) 0:04:14.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.034) 0:04:14.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:10:40 +0000 (0:00:00.033) 0:04:14.499 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.377) 0:04:14.877 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.387) 0:04:15.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.041) 0:04:15.306 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.037) 0:04:15.344 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.033) 0:04:15.377 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.035) 0:04:15.413 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.044) 0:04:15.457 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.036) 0:04:15.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.034) 0:04:15.528 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.039) 0:04:15.567 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.038) 0:04:15.605 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:10:41 +0000 (0:00:00.044) 0:04:15.650 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", 
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032719", "end": "2022-06-01 13:10:41.556552", "rc": 0, "start": "2022-06-01 13:10:41.523833" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.418) 0:04:16.069 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.042) 0:04:16.112 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.042) 0:04:16.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.036) 0:04:16.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.037) 0:04:16.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.033) 0:04:16.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.034) 0:04:16.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.034) 0:04:16.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.032) 0:04:16.364 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.030) 0:04:16.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.037) 0:04:16.432 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } 
TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:457 Wednesday 01 June 2022 17:10:42 +0000 (0:00:00.394) 0:04:16.827 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.053) 0:04:16.880 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.047) 0:04:16.927 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.510) 0:04:17.437 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.074) 0:04:17.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.030) 0:04:17.543 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.033) 0:04:17.576 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.063) 0:04:17.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:10:43 +0000 (0:00:00.026) 0:04:17.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:10:44 +0000 
(0:00:00.884) 0:04:18.550 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:10:44 +0000 (0:00:00.039) 0:04:18.590 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:10:44 +0000 (0:00:00.034) 0:04:18.625 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:10:46 +0000 (0:00:01.280) 0:04:19.906 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:10:46 +0000 (0:00:00.057) 0:04:19.964 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:10:46 +0000 (0:00:00.028) 0:04:19.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:10:46 +0000 (0:00:00.031) 0:04:20.024 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:10:46 +0000 (0:00:00.029) 0:04:20.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:10:47 +0000 (0:00:00.826) 0:04:20.880 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { 
"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { 
"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service": { "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:10:48 +0000 (0:00:01.764) 0:04:22.645 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:10:48 +0000 (0:00:00.049) 0:04:22.694 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\" umount.target cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable 
cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:29 EDT", "StateChangeTimestampMonotonic": "3018914815", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:10:49 +0000 
(0:00:00.681) 0:04:23.375 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:10:50 +0000 (0:00:01.238) 0:04:24.614 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 
0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2' in safe mode due to encryption removal"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:10:50 +0000 (0:00:00.044) 0:04:24.658 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", 
"RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:29 EDT", "StateChangeTimestampMonotonic": "3018914815", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:477 Wednesday 01 June 2022 17:10:51 +0000 (0:00:00.730) 0:04:25.389 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:483 Wednesday 01 June 2022 17:10:51 +0000 (0:00:00.038) 0:04:25.428 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:10:51 +0000 (0:00:00.036) 0:04:25.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103442.3101215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103442.3101215, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103442.3101215, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "4161050601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:10:51 +0000 (0:00:00.397) 0:04:25.861 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:494 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.038) 0:04:25.900 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.053) 0:04:25.953 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.048) 0:04:26.001 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.523) 0:04:26.525 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.080) 0:04:26.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.033) 0:04:26.639 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.034) 0:04:26.673 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.068) 0:04:26.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:10:52 +0000 (0:00:00.030) 0:04:26.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:10:53 +0000 (0:00:00.859) 0:04:27.633 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:10:53 +0000 
(0:00:00.040) 0:04:27.673 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:10:53 +0000 (0:00:00.035) 0:04:27.709 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:10:55 +0000 (0:00:01.277) 0:04:28.987 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:10:55 +0000 (0:00:00.070) 0:04:29.057 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:10:55 +0000 (0:00:00.033) 0:04:29.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:10:55 +0000 (0:00:00.032) 0:04:29.123 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:10:55 +0000 (0:00:00.037) 
0:04:29.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:10:56 +0000 (0:00:00.849) 0:04:30.010 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": 
"dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": 
"insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service": { "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": 
"systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:10:57 +0000 (0:00:01.818) 0:04:31.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:10:58 +0000 (0:00:00.056) 0:04:31.885 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\" cryptsetup.target umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", 
"CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not 
set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", 
"MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:29 EDT", "StateChangeTimestampMonotonic": "3018914815", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:10:58 +0000 (0:00:00.792) 0:04:32.678 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": 
"xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:11:00 +0000 (0:00:02.066) 0:04:34.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:11:00 +0000 (0:00:00.034) 0:04:34.779 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace 
cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": 
"15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", 
"Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:29 EDT", "StateChangeTimestampMonotonic": "3018914815", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", 
"TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:11:01 +0000 (0:00:00.695) 0:04:35.475 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:11:01 +0000 (0:00:00.058) 0:04:35.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:11:01 +0000 (0:00:00.042) 0:04:35.576 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:11:01 +0000 (0:00:00.035) 0:04:35.612 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:11:02 +0000 (0:00:00.400) 0:04:36.012 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:11:02 +0000 (0:00:00.693) 0:04:36.706 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:11:03 +0000 (0:00:00.439) 0:04:37.145 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:11:03 +0000 (0:00:00.682) 0:04:37.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103413.6461215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0abd6c3b575420382c580697f1d39e5f08cdaaf4", "ctime": 1654103411.4221215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792398, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103411.4221215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3853900254", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:11:04 +0000 (0:00:00.389) 0:04:38.217 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:11:04 +0000 (0:00:00.397) 0:04:38.614 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:510 Wednesday 01 June 2022 17:11:05 +0000 (0:00:00.858) 0:04:39.472 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:11:05 +0000 (0:00:00.062) 0:04:39.534 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:11:05 +0000 (0:00:00.043) 0:04:39.577 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:11:05 +0000 (0:00:00.033) 0:04:39.611 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f88a64e1-fc8e-498a-ab14-59ca29a52b6b" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:11:06 +0000 (0:00:00.390) 0:04:39.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003133", "end": "2022-06-01 13:11:05.868733", "rc": 0, "start": "2022-06-01 13:11:05.865600" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:11:06 +0000 (0:00:00.411) 0:04:40.385 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003002", "end": "2022-06-01 13:11:06.279632", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:11:06.276630" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:11:06 +0000 (0:00:00.411) 0:04:40.797 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
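The fstab check performed by the task above (confirming that /dev/mapper/foo-test1 is present in /etc/fstab) can be reproduced outside Ansible. A minimal sketch, assuming the fstab text captured in the STDOUT above; `fstab_entries` and `has_entry` are hypothetical helpers, not part of the linux-system-roles.storage role:

```python
# Sketch: parse fstab text and check that a device has an entry.
# FSTAB reproduces the STDOUT of the "Read the /etc/fstab file" task.
FSTAB = """\
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
"""

def fstab_entries(text):
    """Yield (device, mount_point, fstype, options, dump, passno) per entry."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        fields = line.split()
        if len(fields) == 6:
            yield tuple(fields)

def has_entry(text, device):
    """True if any fstab entry's device field matches `device`."""
    return any(entry[0] == device for entry in fstab_entries(text))

print(has_entry(FSTAB, "/dev/mapper/foo-test1"))  # True: the volume is in fstab
```

The role's own verification works similarly, matching the volume's mount id and mount point against the fetched fstab contents before asserting.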
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:11:06 +0000 (0:00:00.069) 0:04:40.866 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.032) 0:04:40.899 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.064) 0:04:40.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.047) 0:04:41.010 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.400) 0:04:41.411 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.045) 0:04:41.456 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.042) 0:04:41.498 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.039) 0:04:41.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.039) 0:04:41.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.032) 0:04:41.610 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.045) 0:04:41.656 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.059) 0:04:41.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.036) 0:04:41.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.034) 0:04:41.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.035) 0:04:41.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:11:07 +0000 (0:00:00.032) 0:04:41.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.031) 0:04:41.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:11:08 +0000 (0:00:00.031) 0:04:41.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.030) 0:04:41.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.030) 0:04:41.978 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.061) 0:04:42.039 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.061) 0:04:42.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.033) 0:04:42.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:11:08 +0000 (0:00:00.030) 0:04:42.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.031) 0:04:42.197 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.064) 0:04:42.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.036) 0:04:42.298 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.038) 0:04:42.337 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.056) 0:04:42.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.035) 0:04:42.429 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.035) 0:04:42.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.031) 0:04:42.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.034) 0:04:42.531 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.032) 0:04:42.563 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.034) 0:04:42.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.032) 0:04:42.630 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.066) 0:04:42.697 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.065) 0:04:42.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.030) 0:04:42.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:11:08 +0000 (0:00:00.048) 0:04:42.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.093) 0:04:42.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.032) 0:04:42.967 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.030) 0:04:42.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.030) 0:04:43.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.030) 0:04:43.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.031) 0:04:43.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.033) 0:04:43.124 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.060) 0:04:43.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.036) 0:04:43.221 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.125) 0:04:43.347 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.038) 0:04:43.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "f88a64e1-fc8e-498a-ab14-59ca29a52b6b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "f88a64e1-fc8e-498a-ab14-59ca29a52b6b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.045) 0:04:43.431 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.038) 0:04:43.469 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:11:09 +0000 
(0:00:00.036) 0:04:43.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.038) 0:04:43.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.030) 0:04:43.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.031) 0:04:43.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.030) 0:04:43.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.032) 0:04:43.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.049) 0:04:43.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.036) 0:04:43.755 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.040) 0:04:43.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.032) 0:04:43.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:11:09 +0000 (0:00:00.033) 0:04:43.860 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.039) 0:04:43.900 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.037) 0:04:43.938 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103460.1481216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103460.1481216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15412, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103460.1481216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.389) 0:04:44.327 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.039) 0:04:44.367 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.039) 0:04:44.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.036) 0:04:44.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.032) 0:04:44.475 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.037) 0:04:44.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.032) 0:04:44.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.032) 0:04:44.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.033) 0:04:44.611 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.037) 0:04:44.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.030) 0:04:44.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.030) 0:04:44.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.031) 0:04:44.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.031) 0:04:44.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.034) 0:04:44.808 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:11:10 +0000 (0:00:00.039) 0:04:44.847 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.037) 0:04:44.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.031) 0:04:44.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.029) 0:04:44.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.030) 0:04:44.977 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.032) 0:04:45.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.030) 0:04:45.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.030) 0:04:45.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.030) 0:04:45.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.031) 0:04:45.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.032) 0:04:45.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.034) 0:04:45.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.080) 0:04:45.279 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:11:11 +0000 (0:00:00.391) 0:04:45.671 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.371) 0:04:46.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.039) 0:04:46.082 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.035) 0:04:46.118 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.032) 0:04:46.150 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.030) 0:04:46.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.031) 0:04:46.213 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.035) 0:04:46.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.032) 0:04:46.281 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.037) 0:04:46.318 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.035) 0:04:46.353 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.041) 0:04:46.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.041689", "end": "2022-06-01 13:11:12.325877", "rc": 0, "start": "2022-06-01 13:11:12.284188" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:11:12 +0000 (0:00:00.445) 0:04:46.840 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.040) 0:04:46.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.042) 0:04:46.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.034) 0:04:46.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.034) 0:04:46.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.033) 0:04:47.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.032) 0:04:47.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.031) 0:04:47.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.032) 0:04:47.122 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.028) 0:04:47.150 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [create a file] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/create-test-file.yml:10 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.032) 0:04:47.183 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:516 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.385) 0:04:47.569 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.049) 0:04:47.619 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:11:13 +0000 (0:00:00.046) 0:04:47.666 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:11:14 +0000 (0:00:00.519) 0:04:48.185 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", 
"kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:11:14 +0000 (0:00:00.075) 0:04:48.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:11:14 +0000 (0:00:00.037) 0:04:48.297 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:11:14 +0000 (0:00:00.034) 0:04:48.332 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:11:14 +0000 (0:00:00.066) 0:04:48.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 
17:11:14 +0000 (0:00:00.029) 0:04:48.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:11:15 +0000 (0:00:00.916) 0:04:49.345 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:11:15 +0000 (0:00:00.041) 0:04:49.386 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:11:15 +0000 (0:00:00.034) 0:04:49.420 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:11:16 +0000 (0:00:01.244) 0:04:50.665 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:11:16 +0000 
(0:00:00.059) 0:04:50.724 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:11:16 +0000 (0:00:00.028) 0:04:50.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:11:16 +0000 (0:00:00.038) 0:04:50.792 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:11:17 +0000 (0:00:00.090) 0:04:50.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:11:17 +0000 (0:00:00.913) 0:04:51.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": 
"rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service": { "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { 
"name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:11:19 +0000 (0:00:01.705) 0:04:53.501 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service" ] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:11:19 +0000 (0:00:00.048) 0:04:53.549 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\" umount.target cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork":
"no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 /dev/mapper/foo-test1 - ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-12aa7b2f-13bd-4816-86a7-b8cb1f1bbad2 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", 
"RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:10:29 EDT", "StateChangeTimestampMonotonic": "3018914815", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "infinity" } }

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:11:20 +0000 (0:00:00.726) 0:04:54.276 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:11:21 +0000 (0:00:01.316) 0:04:55.592 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'yabbadabbadoo', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None,
u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:11:21 +0000 (0:00:00.048) 0:04:55.640 ******** changed: [/cache/rhel-x.qcow2] => 
(item=systemd-cryptsetup@luks\x2d12aa7b2f\x2d13bd\x2d4816\x2d86a7\x2db8cb1f1bbad2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "name": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d12aa7b2f\\x2d13bd\\x2d4816\\x2d86a7\\x2db8cb1f1bbad2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d12aa7b2f\\\\x2d13bd\\\\x2d4816\\\\x2d86a7\\\\x2db8cb1f1bbad2.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:536 Wednesday 01 June 2022 17:11:22 +0000 (0:00:00.707) 0:04:56.348 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:542 Wednesday 01 June 2022 17:11:22 +0000 
(0:00:00.038) 0:04:56.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:10 Wednesday 01 June 2022 17:11:22 +0000 (0:00:00.041) 0:04:56.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103473.0551214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103473.0551214, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1654103473.0551214, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1219568964", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmp7247_7fr/tests/verify-data-preservation.yml:15 Wednesday 01 June 2022 17:11:22 +0000 (0:00:00.412) 0:04:56.840 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:553 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.038) 0:04:56.878 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.051) 0:04:56.930 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.048) 0:04:56.979 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.518) 0:04:57.497 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.074) 0:04:57.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.031) 0:04:57.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.035) 0:04:57.640 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.063) 0:04:57.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:11:23 +0000 (0:00:00.028) 0:04:57.731 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:11:24 +0000 (0:00:00.880) 0:04:58.611 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 
June 2022 17:11:24 +0000 (0:00:00.042) 0:04:58.654 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:11:24 +0000 (0:00:00.036) 0:04:58.690 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:11:26 +0000 (0:00:01.233) 0:04:59.924 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:11:26 +0000 (0:00:00.058) 0:04:59.982 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:11:26 +0000 (0:00:00.030) 0:05:00.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:11:26 +0000 (0:00:00.036) 0:05:00.049 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 
June 2022 17:11:26 +0000 (0:00:00.029) 0:05:00.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:11:27 +0000 (0:00:00.935) 0:05:01.014 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": 
"cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": 
"emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": 
"nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { 
"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:11:28 +0000 (0:00:01.702) 0:05:02.716 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:11:28 +0000 (0:00:00.049) 0:05:02.766 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:11:28 +0000 (0:00:00.029) 0:05:02.796 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
"yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:11:44 +0000 (0:00:15.866) 0:05:18.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:11:44 +0000 (0:00:00.031) 0:05:18.694 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:11:44 +0000 (0:00:00.031) 0:05:18.725 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "present" } ], "failed": false, 
"leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage 
: set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:11:44 +0000 (0:00:00.043) 0:05:18.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:11:44 +0000 (0:00:00.039) 0:05:18.807 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:11:44 +0000 (0:00:00.034) 0:05:18.842 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:11:45 +0000 (0:00:00.419) 0:05:19.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:11:46 +0000 (0:00:00.674) 0:05:19.935 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c" } TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:11:46 +0000 (0:00:00.410) 0:05:20.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:11:47 +0000 (0:00:00.673) 0:05:21.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103466.2791214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103464.0921216, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 21708, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103464.0911214, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3753708411", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:11:47 +0000 (0:00:00.378) 0:05:21.397 ******** changed: [/cache/rhel-x.qcow2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-4fd3f766-b904-44e6-bd1b-02e581212b5c', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": 
"/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:11:47 +0000 (0:00:00.413) 0:05:21.810 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:569 Wednesday 01 June 2022 17:11:48 +0000 (0:00:00.860) 0:05:22.670 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:11:48 +0000 (0:00:00.049) 0:05:22.720 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:11:48 +0000 (0:00:00.042) 0:05:22.762 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:11:48 +0000 (0:00:00.080) 0:05:22.843 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "4fd3f766-b904-44e6-bd1b-02e581212b5c" }, "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "size": "4G", "type": "crypt", "uuid": "897f5026-b339-4894-95d4-7c7b6f3357dc" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": 
"1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:11:49 +0000 (0:00:00.391) 0:05:23.234 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002743", "end": "2022-06-01 13:11:49.105880", "rc": 0, "start": "2022-06-01 13:11:49.103137" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:11:49 +0000 (0:00:00.389) 0:05:23.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003437", "end": "2022-06-01 13:11:49.495193", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:11:49.491756" } STDOUT: luks-4fd3f766-b904-44e6-bd1b-02e581212b5c 
/dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.393) 0:05:24.017 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.065) 0:05:24.083 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.033) 0:05:24.116 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.066) 0:05:24.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.041) 0:05:24.224 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": 
"/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.387) 0:05:24.612 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.048) 0:05:24.661 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.042) 0:05:24.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.037) 0:05:24.741 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.035) 0:05:24.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.033) 0:05:24.809 ******** ok: [/cache/rhel-x.qcow2] => 
(item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:11:50 +0000 (0:00:00.041) 0:05:24.851 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.056) 0:05:24.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.030) 0:05:24.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.033) 0:05:24.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.031) 0:05:25.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.030) 0:05:25.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare 
devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.029) 0:05:25.062 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.032) 0:05:25.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.032) 0:05:25.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.034) 0:05:25.162 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.060) 0:05:25.222 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.065) 0:05:25.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.033) 0:05:25.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.032) 0:05:25.354 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.032) 0:05:25.387 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.062) 0:05:25.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.039) 0:05:25.489 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.039) 0:05:25.528 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.062) 0:05:25.591 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.038) 0:05:25.630 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.037) 0:05:25.667 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.103) 0:05:25.770 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.033) 0:05:25.804 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:11:51 +0000 (0:00:00.034) 0:05:25.839 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.034) 0:05:25.873 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.033) 0:05:25.906 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.066) 0:05:25.973 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.064) 0:05:26.037 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.031) 0:05:26.068 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.034) 0:05:26.103 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.031) 0:05:26.134 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.031) 0:05:26.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.032) 0:05:26.199 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.030) 0:05:26.230 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.030) 0:05:26.260 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.034) 0:05:26.294 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.030) 0:05:26.324 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.041) 0:05:26.387 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.041) 0:05:26.429 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.135) 0:05:26.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.036) 0:05:26.602 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1026386, "block_size": 4096, "block_total": 1041920, "block_used": 15534, "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4204077056, "size_total": 4267704320, "uuid": "897f5026-b339-4894-95d4-7c7b6f3357dc" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1026386, "block_size": 4096, "block_total": 1041920, "block_used": 15534, "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fstype": "xfs", "inode_available": 2088957, "inode_total": 2088960, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4204077056, "size_total": 4267704320, "uuid": "897f5026-b339-4894-95d4-7c7b6f3357dc" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.042) 0:05:26.645 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.039) 0:05:26.684 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.036) 0:05:26.721 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.047) 0:05:26.768 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.036) 0:05:26.805 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.030) 0:05:26.835 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:11:52 +0000 (0:00:00.029) 0:05:26.864 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.034) 0:05:26.899 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.048) 0:05:26.948 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.035) 0:05:26.984 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.041) 0:05:27.026 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.031) 0:05:27.057 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.032) 0:05:27.090 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.037) 0:05:27.128 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.038) 0:05:27.166 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103503.9121215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103503.9121215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15412, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103503.9121215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.406) 0:05:27.572 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.037) 0:05:27.610 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.039) 0:05:27.649 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.035) 0:05:27.684 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.030) 0:05:27.715 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:11:53 +0000 (0:00:00.036) 0:05:27.751 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103504.0601215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103504.0601215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15578, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103504.0601215, "nlink": 1, "path": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.398) 0:05:28.149 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.012221", "end": "2022-06-01 13:11:54.024471", "rc": 0, "start": "2022-06-01 13:11:54.012250" }
STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 4fd3f766-b904-44e6-bd1b-02e581212b5c Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 10 Memory: 906462 Threads: 4 Salt: d2 2e 86 33 5f 27 4d 17 2e c4 b6 ca 31 fd 3c 3f e7 cf a5 d8 87 f0 30 f7 c0 2e 30 37 1e 44 32 35 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 97814 Salt: 8e 32 99 ba 4e f9 f6 94 59 9e 19 d8 f3 b1 2f e1 8d 3b 9e c4 95 3c 31 fd 21 b2 ed 7d dc 4f 74 49 Digest: 4a 38 c7 90 dc 99 b7 98 8e 29 3a 08 0d a2 53 77 fd 42 7e 0a 00 56 c5 f8 f9 5a 4c d1 c4 18 e4 a4

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.389) 0:05:28.538 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.042) 0:05:28.581 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.040) 0:05:28.621 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.039) 0:05:28.661 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.039) 0:05:28.701 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.031) 0:05:28.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.030) 0:05:28.763 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.031) 0:05:28.794 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:11:54 +0000 (0:00:00.041) 0:05:28.836 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.036) 0:05:28.872 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.042) 0:05:28.915 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.040) 0:05:28.955 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.039) 0:05:28.995 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.032) 0:05:29.028 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.039) 0:05:29.067 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.033) 0:05:29.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.039) 0:05:29.140 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.034) 0:05:29.174 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.033) 0:05:29.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.031) 0:05:29.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.030) 0:05:29.270 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:11:55 +0000 (0:00:00.378) 0:05:29.649 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.375) 0:05:30.024 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.041) 0:05:30.066 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.035) 0:05:30.101 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.031) 0:05:30.133 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.032) 0:05:30.165 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.032) 0:05:30.198 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.033) 0:05:30.231 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.032) 0:05:30.263 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.036) 0:05:30.299 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.037) 0:05:30.336 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.040) 0:05:30.377 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.039409", "end": "2022-06-01 13:11:56.292645", "rc": 0, "start": "2022-06-01 13:11:56.253236" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.432) 0:05:30.809 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:11:56 +0000 (0:00:00.040) 0:05:30.850 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.041) 0:05:30.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.033) 0:05:30.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.035) 0:05:30.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.034) 0:05:30.994 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.034) 0:05:31.029 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.032) 0:05:31.061 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.031) 0:05:31.093 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.028) 0:05:31.121 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:571
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.034) 0:05:31.155 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.075) 0:05:31.231 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.050) 0:05:31.281 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:11:57 +0000 (0:00:00.557) 0:05:31.838 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:11:58 +0000 (0:00:00.075) 0:05:31.913 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:11:58 +0000 (0:00:00.035) 0:05:31.949 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:11:58 +0000 (0:00:00.032) 0:05:31.981 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:11:58 +0000 (0:00:00.065) 0:05:32.047 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:11:58 +0000 (0:00:00.027) 0:05:32.075 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:11:59 +0000 (0:00:00.901) 0:05:32.977 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:11:59 +0000 (0:00:00.036) 0:05:33.013 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:11:59 +0000 (0:00:00.038) 0:05:33.052 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:12:00 +0000 (0:00:01.350) 0:05:34.403 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:12:00 +0000 (0:00:00.059) 0:05:34.463 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:12:00 +0000 (0:00:00.030) 0:05:34.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:12:00 +0000 (0:00:00.034) 0:05:34.527 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:12:00 +0000 (0:00:00.032) 0:05:34.559 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:12:01 +0000 (0:00:00.853) 0:05:35.412 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service",
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { 
"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { 
"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:12:03 +0000 (0:00:01.741) 0:05:37.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:12:03 +0000 
(0:00:00.049) 0:05:37.203 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:12:03 +0000 (0:00:00.029) 0:05:37.233 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": 
"defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:12:05 +0000 (0:00:02.086) 0:05:39.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:12:05 +0000 (0:00:00.034) 0:05:39.354 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:12:05 +0000 (0:00:00.027) 0:05:39.381 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", 
"/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:12:05 +0000 (0:00:00.043) 0:05:39.424 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:12:05 +0000 (0:00:00.034) 0:05:39.459 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:12:05 +0000 (0:00:00.038) 0:05:39.498 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4fd3f766-b904-44e6-bd1b-02e581212b5c" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:12:06 +0000 (0:00:00.407) 0:05:39.905 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:12:06 +0000 (0:00:00.697) 0:05:40.603 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:12:06 +0000 (0:00:00.033) 0:05:40.637 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:12:07 +0000 (0:00:00.658) 0:05:41.295 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103509.4941216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "06683dcac853cdb5e5855ddb830292fe9fe5f0d7", "ctime": 1654103507.2771215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792398, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1654103507.2761216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3853900268", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:12:07 +0000 (0:00:00.409) 0:05:41.705 ********
changed: [/cache/rhel-x.qcow2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-4fd3f766-b904-44e6-bd1b-02e581212b5c', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var":
"entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "password": "-", "state": "absent" }, "found": 1 }

MSG:
1 line(s) removed

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:12:08 +0000 (0:00:00.400) 0:05:42.106 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_luks.yml:581
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.834) 0:05:42.940 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.052) 0:05:42.993 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.031) 0:05:43.025 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=xlDVMn-A71i-YFY9-S2rR-DBhN-K3RJ-OXamjD", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count":
null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.038) 0:05:43.064 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.393) 0:05:43.458 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003059", "end": "2022-06-01 13:12:09.319268", "rc": 0, "start": "2022-06-01 13:12:09.316209" }

STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:12:09 +0000 (0:00:00.378) 0:05:43.836 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002695", "end": "2022-06-01 13:12:09.701807", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:12:09.699112" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.381) 0:05:44.217 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.031) 0:05:44.249 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.032) 0:05:44.281 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.063) 0:05:44.344 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.036) 0:05:44.381 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.121) 0:05:44.502 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.037) 0:05:44.540 ********
ok: [/cache/rhel-x.qcow2] => {
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.040) 0:05:44.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.030) 0:05:44.611 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.034) 0:05:44.646 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.033) 0:05:44.679 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.030) 0:05:44.710 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.031)
0:05:44.741 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.030) 0:05:44.771 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.030) 0:05:44.802 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:12:10 +0000 (0:00:00.044) 0:05:44.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.027) 0:05:44.875 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify mount_options] ****************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.035) 0:05:44.910 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:44.941 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:44.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.029) 0:05:45.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.024) 0:05:45.025 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103524.7021215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103524.7021215, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312,
"isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654103524.7021215, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.397) 0:05:45.423 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.038) 0:05:45.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.027) 0:05:45.489 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.032) 0:05:45.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.031) 0:05:45.553 ********
skipping: [/cache/rhel-x.qcow2]
=> { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.026) 0:05:45.579 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:45.610 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.035) 0:05:45.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.031) 0:05:45.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.027) 0:05:45.703 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.031) 0:05:45.735 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false,
"skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:45.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:45.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.034) 0:05:45.830 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:12:11 +0000 (0:00:00.030) 0:05:45.861 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.038) 0:05:45.899 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.036) 0:05:45.935 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.029) 0:05:45.964 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.030) 0:05:45.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.033) 0:05:46.029 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.085) 0:05:46.114 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.035) 0:05:46.150 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:12:12
+0000 (0:00:00.033) 0:05:46.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.031) 0:05:46.215 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.247 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.279 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.034) 0:05:46.314 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.346 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.031) 0:05:46.378 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.410 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.037) 0:05:46.448 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.035) 0:05:46.483 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.516 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.033) 0:05:46.550 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.033) 0:05:46.583 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.030) 0:05:46.613 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.029) 0:05:46.643 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.034) 0:05:46.678 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.040) 0:05:46.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.037) 0:05:46.756 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.035) 0:05:46.791 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.032) 0:05:46.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed":
false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:12:12 +0000 (0:00:00.031) 0:05:46.855 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.028) 0:05:46.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.033) 0:05:46.917 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.032) 0:05:46.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.031) 0:05:46.982 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.031) 0:05:47.014 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null,
"storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1165 changed=62 unreachable=0 failed=9 skipped=646 rescued=9 ignored=0

Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.016) 0:05:47.031 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 15.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 11.63s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state -- 10.60s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 9.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 9.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.13s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.76s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules',
u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:12:13 +0000 (0:00:00.022) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:12:15 +0000 (0:00:01.249) 0:00:01.272 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.25s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_auto_size_cap.yml ******************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:2
Wednesday 01 June 2022
17:12:15 +0000 (0:00:00.016) 0:00:01.289 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:7
Wednesday 01 June 2022 17:12:16 +0000 (0:00:01.037) 0:00:02.327 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:12:16 +0000 (0:00:00.037) 0:00:02.365 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:12:16 +0000 (0:00:00.149) 0:00:02.514 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:12:16 +0000 (0:00:00.535) 0:00:03.050 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:12:17 +0000 (0:00:00.083) 0:00:03.133 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:12:17 +0000 (0:00:00.022) 0:00:03.156 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:12:17 +0000 (0:00:00.022) 0:00:03.178 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:12:17 +0000 (0:00:00.184) 0:00:03.362 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:12:17 +0000 (0:00:00.018) 0:00:03.381 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:12:18 +0000 (0:00:01.068) 0:00:04.449 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:12:18 +0000 (0:00:00.046) 0:00:04.496 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:12:18 +0000 (0:00:00.043) 0:00:04.539 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:12:19 +0000 (0:00:00.692) 0:00:05.232 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:12:19 +0000 (0:00:00.080) 0:00:05.312 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:12:19 +0000 (0:00:00.020) 0:00:05.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:12:19 +0000 (0:00:00.022) 0:00:05.355 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:12:19 +0000 (0:00:00.019) 0:00:05.374 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:12:20 +0000 (0:00:00.832) 0:00:06.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service": { "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:12:21 +0000 (0:00:01.812) 0:00:08.020 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:12:21 +0000 (0:00:00.043) 0:00:08.063 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.target\" cryptsetup.target umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable 
cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-4fd3f766-b904-44e6-bd1b-02e581212b5c", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4fd3f766-b904-44e6-bd1b-02e581212b5c /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4fd3f766-b904-44e6-bd1b-02e581212b5c /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4fd3f766-b904-44e6-bd1b-02e581212b5c ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4fd3f766-b904-44e6-bd1b-02e581212b5c ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", 
"RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2022-06-01 13:12:06 EDT", "StateChangeTimestampMonotonic": "3115787035", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:12:22 +0000 (0:00:00.991) 0:00:09.055 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], 
"changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:12:23 +0000 (0:00:00.529) 0:00:09.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:12:23 +0000 (0:00:00.029) 0:00:09.614 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner 
cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", 
"KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
"UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.682) 0:00:10.296 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.031) 0:00:10.328 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.033) 0:00:10.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.031) 0:00:10.393 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.028) 0:00:10.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current 
mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.027) 0:00:10.449 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.029) 0:00:10.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.028) 0:00:10.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.461) 0:00:10.968 ******** TASK [linux-system-roles.storage : Update facts] 
******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:12:24 +0000 (0:00:00.027) 0:00:10.995 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:10 Wednesday 01 June 2022 17:12:25 +0000 (0:00:00.815) 0:00:11.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:17 Wednesday 01 June 2022 17:12:25 +0000 (0:00:00.030) 0:00:11.841 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:12:25 +0000 (0:00:00.043) 0:00:11.884 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.515) 0:00:12.400 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.034) 0:00:12.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.029) 0:00:12.465 
******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [lsblk -b -l --noheadings -o NAME,SIZE] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:22 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.032) 0:00:12.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "-b", "-l", "--noheadings", "-o", "NAME,SIZE" ], "delta": "0:00:00.004765", "end": "2022-06-01 13:12:26.296066", "rc": 0, "start": "2022-06-01 13:12:26.291301" } STDOUT: sda 10737418240 sdb 10737418240 sdc 10737418240 sr0 376832 vda 10737418240 vda1 1048576 vda2 209715200 vda3 524288000 vda4 10001300992 vdb 10737418240 vdc 10737418240 vdd 10737418240 TASK [Set test_disk_size] ****************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:27 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.538) 0:00:13.036 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "test_disk_size": "10737418240" }, "changed": false } TASK [Ensure bc is installed] ************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:33 Wednesday 01 June 2022 17:12:26 +0000 (0:00:00.033) 0:00:13.070 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "rc": 0, "results": [ "Installed: bc-1.07.1-14.el9.x86_64" ] } TASK [bc 10737418240 *2] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:38 Wednesday 01 June 2022 17:12:28 +0000 (0:00:01.226) 0:00:14.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "bc" ], "delta": "0:00:00.003077", "end": "2022-06-01 13:12:27.924533", "rc": 0, "start": "2022-06-01 13:12:27.921456" } STDOUT: 21474836480 TASK [Try to create a pool containing one volume twice the size of the backing disk] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:47 Wednesday 01 June 2022 17:12:28 +0000 (0:00:00.362) 
0:00:14.660 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:12:28 +0000 (0:00:00.047) 0:00:14.708 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:12:28 +0000 (0:00:00.042) 0:00:14.750 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.536) 0:00:15.286 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.068) 0:00:15.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.383 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.412 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.059) 0:00:15.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.023) 0:00:15.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.027) 0:00:15.523 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": 
"lvm", "volumes": [ { "name": "test1", "size": "21474836480" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.034) 0:00:15.557 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.031) 0:00:15.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.028) 0:00:15.703 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:12:29 +0000 (0:00:00.041) 0:00:15.744 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease 
cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", 
"ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:12:30 +0000 (0:00:00.668) 0:00:16.413 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: specified size for volume '20 GiB' exceeds available space in pool 'foo' (10 GiB) TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:12:31 +0000 (0:00:01.042) 0:00:17.455 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'21474836480', u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': 
False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"specified size for volume '20 GiB' exceeds available space in pool 'foo' (10 GiB)"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:12:31 +0000 (0:00:00.042) 0:00:17.497 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", 
"RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:62 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.678) 0:00:18.176 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a pool containing one volume the same size as the backing disk] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:70 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.038) 0:00:18.214 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 
Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.048) 0:00:18.263 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.044) 0:00:18.308 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.544) 0:00:18.853 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.069) 0:00:18.922 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": 
false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.031) 0:00:18.954 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.041) 0:00:18.996 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:12:32 +0000 (0:00:00.096) 0:00:19.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.025) 0:00:19.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.030) 0:00:19.148 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "name": "test1", "size": "10737418240" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 
01 June 2022 17:12:33 +0000 (0:00:00.035) 0:00:19.184 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.034) 0:00:19.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.028) 0:00:19.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.028) 0:00:19.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.028) 0:00:19.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.028) 0:00:19.333 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service" ] }, 
"changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.041) 0:00:19.374 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": 
"1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": 
"no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:12:33 +0000 (0:00:00.688) 0:00:20.063 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", 
"device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 
01 June 2022 17:12:35 +0000 (0:00:01.736) 0:00:21.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:12:35 +0000 (0:00:00.027) 0:00:21.827 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm 
cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.692) 0:00:22.520 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.040) 0:00:22.560 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.036) 0:00:22.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.034) 0:00:22.631 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.029) 0:00:22.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.035) 0:00:22.697 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.038) 0:00:22.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:12:36 +0000 (0:00:00.032) 0:00:22.768 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:12:37 +0000 (0:00:00.369) 0:00:23.137 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:12:37 +0000 (0:00:00.030) 0:00:23.167 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:81 Wednesday 01 June 2022 17:12:37 +0000 (0:00:00.867) 0:00:24.035 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:12:37 +0000 (0:00:00.048) 0:00:24.084 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:12:38 +0000 (0:00:00.038) 0:00:24.122 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:12:38 +0000 (0:00:00.030) 0:00:24.153 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "c52070f9-264c-4310-a531-2eda1f951571" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "wcqHcc-Pduk-6zjG-jddL-CD32-IdS8-Z7Sedd" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:12:38 +0000 (0:00:00.514) 0:00:24.668 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003035", "end": "2022-06-01 13:12:38.305095", "rc": 0, "start": "2022-06-01 13:12:38.302060" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:12:38 +0000 (0:00:00.379) 0:00:25.047 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002511", "end": "2022-06-01 13:12:38.676890", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:12:38.674379" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.395) 0:00:25.443 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.062) 0:00:25.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.029) 0:00:25.535 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] 
****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.062) 0:00:25.597 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.037) 0:00:25.635 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:12:39 +0000 (0:00:00.455) 0:00:26.090 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.041) 0:00:26.132 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.037) 0:00:26.169 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.036) 0:00:26.205 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.035) 0:00:26.240 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.029) 0:00:26.270 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.040) 0:00:26.311 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.056) 0:00:26.368 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.028) 0:00:26.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.028) 0:00:26.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.028) 0:00:26.454 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.029) 0:00:26.483 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.028) 0:00:26.512 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.031) 0:00:26.543 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.029) 0:00:26.573 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.031) 0:00:26.605 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.057) 0:00:26.662 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.062) 0:00:26.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.029) 0:00:26.754 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.030) 0:00:26.784 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.029) 0:00:26.814 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.061) 0:00:26.875 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.034) 0:00:26.909 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.041) 0:00:26.951 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.061) 0:00:27.013 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.037) 0:00:27.050 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:12:40 +0000 (0:00:00.037) 0:00:27.088 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029)
0:00:27.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:27.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.030) 0:00:27.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.032) 0:00:27.211 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:27.241 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.062) 0:00:27.303 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.063) 0:00:27.366 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:27.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.028) 0:00:27.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:27.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:27.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.031) 0:00:27.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.074) 0:00:27.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.032) 0:00:27.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.032) 0:00:27.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.031) 0:00:27.686 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.061) 0:00:27.748 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.036) 0:00:27.784 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.130) 0:00:27.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.035) 0:00:27.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.039) 0:00:27.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.032) 0:00:28.022 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 
Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.035) 0:00:28.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:12:41 +0000 (0:00:00.029) 0:00:28.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.029) 0:00:28.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.029) 0:00:28.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.029) 0:00:28.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.033) 0:00:28.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", 
"storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.045) 0:00:28.255 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.032) 0:00:28.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.033) 0:00:28.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.030) 0:00:28.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.029) 0:00:28.381 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.036) 0:00:28.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.036) 0:00:28.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103554.9891214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103554.9891214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15780, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103554.9891214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.390) 0:00:28.844 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.035) 0:00:28.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed 
TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.036) 0:00:28.916 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.034) 0:00:28.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.034) 0:00:28.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.040) 0:00:29.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.035) 0:00:29.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:12:42 +0000 (0:00:00.032) 0:00:29.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] 
*** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.032) 0:00:29.126 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.038) 0:00:29.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.030) 0:00:29.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.034) 0:00:29.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.029) 0:00:29.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.027) 0:00:29.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.027) 0:00:29.315 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.035) 0:00:29.350 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.031) 0:00:29.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.029) 0:00:29.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.028) 0:00:29.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.028) 0:00:29.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.029) 0:00:29.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.033) 0:00:29.531 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.032) 0:00:29.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.031) 0:00:29.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.032) 0:00:29.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.031) 0:00:29.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.031) 0:00:29.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:12:43 +0000 (0:00:00.030) 0:00:29.721 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.558) 0:00:30.280 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.380) 0:00:30.660 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.038) 0:00:30.699 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.033) 0:00:30.733 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.030) 0:00:30.764 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.031) 0:00:30.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.030) 0:00:30.826 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.030) 0:00:30.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.030) 0:00:30.887 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.031) 0:00:30.919 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.029) 0:00:30.949 ******** ok: [/cache/rhel-x.qcow2] 
=> { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:12:44 +0000 (0:00:00.035) 0:00:30.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.041031", "end": "2022-06-01 13:12:44.658735", "rc": 0, "start": "2022-06-01 13:12:44.617704" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.424) 0:00:31.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.038) 0:00:31.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.038) 0:00:31.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.033) 0:00:31.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.031) 0:00:31.550 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.032) 0:00:31.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.031) 0:00:31.613 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.028) 0:00:31.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.029) 0:00:31.672 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.028) 0:00:31.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:83 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.029) 0:00:31.730 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.056) 0:00:31.787 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:12:45 +0000 (0:00:00.044) 0:00:31.832 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.513) 0:00:32.345 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty 
list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.070) 0:00:32.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.031) 0:00:32.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.032) 0:00:32.480 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.061) 0:00:32.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.026) 0:00:32.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.030) 0:00:32.599 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": 
[ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "name": "test1", "size": "10737418240" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.039) 0:00:32.638 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.034) 0:00:32.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.029) 0:00:32.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.034) 0:00:32.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.033) 0:00:32.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.030) 0:00:32.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:12:46 +0000 (0:00:00.085) 0:00:32.886 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot 
cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", 
"LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": 
"all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the 
specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:12:47 +0000 (0:00:00.708) 0:00:33.594 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:12:48 +0000 (0:00:01.352) 0:00:34.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:12:48 +0000 (0:00:00.030) 0:00:34.978 ******** changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write 
cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": 
"infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.683) 0:00:35.661 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set 
the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.039) 0:00:35.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.037) 0:00:35.737 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.034) 0:00:35.771 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.028) 0:00:35.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.031) 0:00:35.831 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.028) 0:00:35.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:12:49 +0000 (0:00:00.029) 0:00:35.889 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:12:50 +0000 (0:00:00.385) 0:00:36.275 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:12:50 +0000 (0:00:00.033) 0:00:36.308 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:95 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.874) 0:00:37.183 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.056) 0:00:37.239 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": 
"/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.039) 0:00:37.278 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.030) 0:00:37.309 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "c52070f9-264c-4310-a531-2eda1f951571" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "wcqHcc-Pduk-6zjG-jddL-CD32-IdS8-Z7Sedd" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.398) 0:00:37.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002786", "end": "2022-06-01 13:12:51.346961", "rc": 0, "start": "2022-06-01 13:12:51.344175" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:12:51 +0000 (0:00:00.384) 0:00:38.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002763", "end": "2022-06-01 13:12:51.719443", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:12:51.716680" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:12:52 +0000 (0:00:00.368) 0:00:38.460 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
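The loop-variable warning above also states the fix: give the outer include a dedicated loop variable via `loop_control` so it cannot collide with `storage_test_pool` used inside the included file. A minimal sketch of that pattern (task name and variable name are hypothetical, not taken from the actual test files):

```yaml
# Hypothetical illustration of the loop_control fix suggested by the warning.
# The outer loop picks a distinct variable name so tasks inside the included
# file can keep using 'storage_test_pool' without a collision.
- name: Verify each pool (illustrative example)
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: outer_pool   # avoids clobbering 'storage_test_pool'
```

The actual test suite may structure this differently; this only mirrors what the warning recommends.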
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:12:52 +0000 (0:00:00.110) 0:00:38.570 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:12:52 +0000 (0:00:00.031) 0:00:38.602 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:12:52 +0000 (0:00:00.071) 0:00:38.673 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:12:52 +0000 (0:00:00.043) 0:00:38.717 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.382) 0:00:39.100 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.042) 0:00:39.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.039) 0:00:39.182 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.037) 0:00:39.219 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.035) 0:00:39.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.030) 0:00:39.285 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.044) 0:00:39.329 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.056) 0:00:39.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.028) 0:00:39.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.027) 0:00:39.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.031) 0:00:39.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.031) 0:00:39.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.027) 0:00:39.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:12:53 +0000 (0:00:00.027) 0:00:39.560 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.026) 0:00:39.587 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.027) 0:00:39.615 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.058) 0:00:39.674 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.058) 0:00:39.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.032) 0:00:39.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:12:53 +0000 (0:00:00.038) 0:00:39.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.030) 0:00:39.835 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.061) 0:00:39.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.036) 0:00:39.932 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.039) 0:00:39.972 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.055) 0:00:40.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:12:53 +0000 (0:00:00.036) 0:00:40.063 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.035) 0:00:40.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.030) 0:00:40.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.032) 0:00:40.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.030) 0:00:40.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.031) 0:00:40.223 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.030) 0:00:40.254 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.065) 0:00:40.319 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.063) 0:00:40.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.041) 0:00:40.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.030) 0:00:40.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.032) 0:00:40.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.039) 0:00:40.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.029) 0:00:40.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.030) 0:00:40.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.039) 0:00:40.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.029) 0:00:40.656 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.033) 0:00:40.690 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.134) 0:00:40.825 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.037) 0:00:40.862 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.122) 0:00:40.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.034) 0:00:41.019 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.041) 0:00:41.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:12:54 +0000 (0:00:00.029) 0:00:41.090 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.034) 0:00:41.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.030) 0:00:41.155 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.029) 0:00:41.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.029) 0:00:41.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.031) 0:00:41.245 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.031) 0:00:41.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.047) 0:00:41.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.040) 0:00:41.365 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.035) 0:00:41.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.031) 0:00:41.432 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.033) 0:00:41.465 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.037) 0:00:41.503 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.035) 0:00:41.538 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103568.1591215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103554.9891214, "dev": 5, "device_type": 64768, "executable": false, "exists": 
true, "gid": 6, "gr_name": "disk", "inode": 15780, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103554.9891214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.384) 0:00:41.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.039) 0:00:41.962 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.041) 0:00:42.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.035) 0:00:42.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:12:55 +0000 (0:00:00.032) 0:00:42.072 
********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.037) 0:00:42.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.030) 0:00:42.169 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.199 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.036) 0:00:42.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.031) 0:00:42.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.296 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.326 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.031) 0:00:42.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.387 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.036) 0:00:42.424 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.036) 0:00:42.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.029) 0:00:42.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.028) 0:00:42.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.031) 0:00:42.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.031) 0:00:42.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:12:56
+0000 (0:00:00.032) 0:00:42.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.031) 0:00:42.676 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.030) 0:00:42.707 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.036) 0:00:42.743 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.033) 0:00:42.776 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:12:56 +0000 (0:00:00.032) 0:00:42.809 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.367) 0:00:43.176 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.372) 0:00:43.548 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.040) 0:00:43.588 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.035) 0:00:43.624 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.030) 0:00:43.655 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.030) 0:00:43.686 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.032) 0:00:43.718 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday
01 June 2022 17:12:57 +0000 (0:00:00.031) 0:00:43.749 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.030) 0:00:43.780 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.033) 0:00:43.814 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.035) 0:00:43.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:12:57 +0000 (0:00:00.038) 0:00:43.887 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038966", "end": "2022-06-01 13:12:57.577577", "rc": 0, "start": "2022-06-01 13:12:57.538611" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.433) 0:00:44.321 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.038) 0:00:44.359 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.038) 0:00:44.397 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.030) 0:00:44.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.032) 0:00:44.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.031) 0:00:44.493 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.032) 0:00:44.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.030) 0:00:44.555 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.029) 0:00:44.585 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.027) 0:00:44.612 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:97
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.033) 0:00:44.645 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.063) 0:00:44.709 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:12:58 +0000 (0:00:00.043) 0:00:44.752 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.521) 0:00:45.273 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.071) 0:00:45.345 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.406 ********
included:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.062) 0:00:45.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.026) 0:00:45.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.525 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.035) 0:00:45.560 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.035) 0:00:45.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.626 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.656 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.030) 0:00:45.687 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.029) 0:00:45.717 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service" ] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:12:59 +0000 (0:00:00.044) 0:00:45.761 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name":
"systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": 
"none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service not found.\"", "LoadState": "not-found", 
"LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } }

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:13:00 +0000 (0:00:00.702) 0:00:46.464 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:13:02 +0000 (0:00:01.716) 0:00:48.180 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.035) 0:00:48.216 ********
changed: [/cache/rhel-x.qcow2] => (item=systemd-cryptsetup@luks\x2d4fd3f766\x2db904\x2d44e6\x2dbd1b\x2d02e581212b5c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "name": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown
cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service", "IgnoreOnIsolate": 
"no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "232050688", "LimitMEMLOCKSoft": "232050688", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "6825", "LimitNPROCSoft": "6825", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6825", "LimitSIGPENDINGSoft": "6825", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4fd3f766\\x2db904\\x2d44e6\\x2dbd1b\\x2d02e581212b5c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d4fd3f766\\\\x2db904\\\\x2d44e6\\\\x2dbd1b\\\\x2d02e581212b5c.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "10920", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.695) 0:00:48.911 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.038) 0:00:48.949 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.045) 0:00:48.995 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.036) 0:00:49.031 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.028) 0:00:49.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:13:02 +0000 (0:00:00.030) 0:00:49.091 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:13:03 +0000 (0:00:00.029) 0:00:49.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:13:03 +0000 (0:00:00.030) 0:00:49.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:13:03 +0000 (0:00:00.369) 0:00:49.520 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:13:03 +0000 (0:00:00.029) 0:00:49.550 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=242 changed=13 unreachable=0 failed=1 skipped=169 rescued=1 ignored=0 Wednesday 01 June 2022 17:13:04 +0000 (0:00:00.799) 0:00:50.349 ******** 
=============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.74s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.35s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.25s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Ensure bc is installed -------------------------------------------------- 1.23s /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:33 ------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.04s /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:2 -------------------------- linux-system-roles.storage : Mask the systemd cryptsetup services ------- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.80s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Mask the systemd cryptsetup services ------- 0.71s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 linux-system-roles.storage : Mask the systemd cryptsetup services ------- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 linux-system-roles.storage : Unmask the systemd cryptsetup services ----- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : Unmask the systemd cryptsetup services ----- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. 
Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:13:05 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:13:06 +0000 (0:00:01.264) 0:00:01.287 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.26s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_auto_size_cap_nvme_generated.yml *************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:13:06 +0000 (0:00:00.020) 0:00:01.308 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.26s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. 
Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:13:07 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:13:08 +0000 (0:00:01.275) 0:00:01.298 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_auto_size_cap_scsi_generated.yml *************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_scsi_generated.yml:3 Wednesday 01 June 2022 17:13:08 +0000 (0:00:00.017) 0:00:01.315 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_scsi_generated.yml:7 Wednesday 01 June 2022 17:13:09 +0000 (0:00:01.077) 0:00:02.393 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:2 Wednesday 01 June 2022 17:13:09 +0000 (0:00:00.026) 0:00:02.419 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:7 Wednesday 01 June 2022 17:13:10 +0000 (0:00:00.807) 0:00:03.227 ******** TASK [linux-system-roles.storage : set platform/version specific 
variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:13:10 +0000 (0:00:00.039) 0:00:03.266 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:13:10 +0000 (0:00:00.153) 0:00:03.419 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:13:10 +0000 (0:00:00.540) 0:00:03.960 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:13:11 +0000 (0:00:00.075) 0:00:04.035 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:13:11 +0000 (0:00:00.022) 0:00:04.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:13:11 +0000 (0:00:00.021) 0:00:04.080 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:13:11 +0000 (0:00:00.187) 0:00:04.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:13:11 +0000 (0:00:00.018) 0:00:04.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:13:12 +0000 (0:00:01.100) 0:00:05.386 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:13:12 +0000 (0:00:00.046) 0:00:05.432 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:13:12 +0000 (0:00:00.046) 0:00:05.479 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:13:13 +0000 (0:00:00.744) 0:00:06.224 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:13:13 +0000 (0:00:00.080) 0:00:06.304 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:13:13 +0000 (0:00:00.021) 0:00:06.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:13:13 +0000 (0:00:00.021) 0:00:06.346 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:13:13 +0000 (0:00:00.019) 0:00:06.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:13:14 +0000 (0:00:00.861) 0:00:07.228 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:13:16 +0000 (0:00:01.811) 0:00:09.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.072) 0:00:09.112 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.027) 0:00:09.140 ******** ok: [/cache/rhel-x.qcow2] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.512) 0:00:09.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.030) 0:00:09.682 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.029) 0:00:09.712 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.033) 0:00:09.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.032) 0:00:09.777 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.032) 0:00:09.810 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.027) 0:00:09.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.027) 0:00:09.866 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.026) 0:00:09.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:13:16 +0000 (0:00:00.029) 0:00:09.921 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", 
"pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:13:17 +0000 (0:00:00.460) 0:00:10.381 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:13:17 +0000 (0:00:00.027) 0:00:10.409 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:10 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.817) 0:00:11.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:17 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.029) 0:00:11.255 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.045) 0:00:11.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.525) 0:00:11.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.035) 0:00:11.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.029) 0:00:11.891 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [lsblk -b -l --noheadings -o NAME,SIZE] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:22 Wednesday 01 June 2022 17:13:18 +0000 (0:00:00.031) 0:00:11.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "-b", "-l", "--noheadings", "-o", "NAME,SIZE" ], "delta": "0:00:00.004670", "end": "2022-06-01 13:13:18.768346", "rc": 0, "start": "2022-06-01 13:13:18.763676" } STDOUT: sda 10737418240 sdb 10737418240 sdc 10737418240 sr0 376832 vda 10737418240 vda1 1048576 vda2 209715200 vda3 524288000 vda4 10001300992 vdb 10737418240 vdc 10737418240 vdd 10737418240 TASK [Set test_disk_size] ****************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:27 Wednesday 01 June 2022 17:13:19 +0000 (0:00:00.484) 0:00:12.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "test_disk_size": "10737418240" }, "changed": false } TASK [Ensure bc is installed] ************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:33 Wednesday 01 June 2022 17:13:19 +0000 (0:00:00.034) 0:00:12.442 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [bc 10737418240 *2] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:38 Wednesday 01 June 2022 17:13:20 +0000 (0:00:00.836) 0:00:13.279 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "bc" ], "delta": "0:00:00.003196", "end": "2022-06-01 13:13:20.004753", "rc": 0, "start": "2022-06-01 13:13:20.001557" } STDOUT: 21474836480 TASK [Try to create a pool containing one volume twice the size of the backing disk] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:47 Wednesday 01 June 2022 17:13:20 +0000 (0:00:00.369) 0:00:13.649 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:13:20 +0000 (0:00:00.079) 0:00:13.729 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:13:20 +0000 (0:00:00.042) 0:00:13.771 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.544) 0:00:14.315 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.068) 0:00:14.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.030) 0:00:14.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.029) 0:00:14.444 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.062) 0:00:14.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:13:21 +0000 
(0:00:00.025) 0:00:14.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.029) 0:00:14.562 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "name": "test1", "size": "21474836480" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.034) 0:00:14.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.031) 0:00:14.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.028) 0:00:14.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.031) 0:00:14.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get 
service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.027) 0:00:14.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.028) 0:00:14.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.042) 0:00:14.788 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:13:21 +0000 (0:00:00.026) 0:00:14.814 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: specified size for volume '20 GiB' exceeds available space in pool 'foo' (10 GiB) TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:13:22 +0000 (0:00:01.125) 0:00:15.940 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'21474836480', u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, 
u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"specified size for volume '20 GiB' exceeds available space in pool 'foo' (10 GiB)"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:13:22 +0000 (0:00:00.043) 0:00:15.983 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:62 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.029) 0:00:16.012 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a pool containing one volume the same size as the backing disk] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:70 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.037) 0:00:16.050 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.049) 0:00:16.100 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure 
ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.043) 0:00:16.143 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.499) 0:00:16.642 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.070) 0:00:16.713 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:13:23 +0000 
(0:00:00.030) 0:00:16.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.029) 0:00:16.773 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.061) 0:00:16.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.060) 0:00:16.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.031) 0:00:16.927 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "name": "test1", "size": "10737418240" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.035) 0:00:16.962 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required 
packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:13:23 +0000 (0:00:00.030) 0:00:16.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.029) 0:00:17.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.028) 0:00:17.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.029) 0:00:17.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.028) 0:00:17.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.042) 0:00:17.152 ******** TASK [linux-system-roles.storage : manage the pools 
and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:13:24 +0000 (0:00:00.025) 0:00:17.177 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:13:25 +0000 (0:00:01.744) 0:00:18.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:13:25 +0000 (0:00:00.028) 0:00:18.951 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:13:25 +0000 (0:00:00.027) 0:00:18.979 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.041) 0:00:19.021 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.036) 0:00:19.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.032) 0:00:19.090 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.028) 0:00:19.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.029) 0:00:19.148 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.028) 0:00:19.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.029) 0:00:19.206 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.379) 0:00:19.585 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:13:26 +0000 (0:00:00.026) 0:00:19.612 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] 
*********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:81 Wednesday 01 June 2022 17:13:27 +0000 (0:00:00.861) 0:00:20.474 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:13:27 +0000 (0:00:00.049) 0:00:20.523 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:13:27 +0000 (0:00:00.038) 0:00:20.561 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:13:27 +0000 (0:00:00.029) 0:00:20.590 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "25964aea-e2ef-415a-a13c-f15676a633fb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Vn3aEm-0lcW-a35e-IW6g-1rfY-IS0d-j5pUmc" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { 
"fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:13:28 +0000 (0:00:00.538) 0:00:21.129 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003146", "end": "2022-06-01 13:13:27.859360", "rc": 0, "start": "2022-06-01 13:13:27.856214" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:13:28 +0000 (0:00:00.370) 0:00:21.500 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002516", "end": "2022-06-01 13:13:28.211588", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:13:28.209072" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:13:28 +0000 (0:00:00.352) 0:00:21.853 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:13:28 +0000 (0:00:00.060) 0:00:21.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:13:28 +0000 (0:00:00.069) 0:00:21.983 ******** 
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.065) 0:00:22.049 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.038) 0:00:22.087 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.457) 0:00:22.545 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.041) 0:00:22.586 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.037) 0:00:22.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false 
} TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.034) 0:00:22.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.034) 0:00:22.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.030) 0:00:22.722 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.041) 0:00:22.763 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.059) 0:00:22.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.030) 0:00:22.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.030) 0:00:22.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.030) 0:00:22.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.032) 0:00:22.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:13:29 +0000 (0:00:00.030) 0:00:22.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.031) 0:00:23.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.031) 0:00:23.040 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.031) 0:00:23.072 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.060) 0:00:23.132 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.060) 0:00:23.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.031) 0:00:23.283 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.061) 0:00:23.344 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.033) 0:00:23.378 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.034) 0:00:23.412 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.060) 0:00:23.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.034) 0:00:23.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.035) 0:00:23.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.030) 0:00:23.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.028) 0:00:23.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.030) 0:00:23.632 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.661 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.031) 0:00:23.693 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.062) 0:00:23.756 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK 
[get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.066) 0:00:23.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.030) 0:00:23.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.030) 0:00:23.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:13:30 +0000 (0:00:00.029) 0:00:23.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** 
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.028) 0:00:24.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.029) 0:00:24.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.030) 0:00:24.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.029) 0:00:24.091 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.091) 0:00:24.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.035) 0:00:24.218 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.126) 0:00:24.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.035) 0:00:24.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.040) 0:00:24.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.029) 0:00:24.451 ******** ok: [/cache/rhel-x.qcow2] => 
{ "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.036) 0:00:24.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.028) 0:00:24.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.029) 0:00:24.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.028) 0:00:24.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.031) 0:00:24.605 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.030) 0:00:24.636 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.044) 0:00:24.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.035) 0:00:24.715 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.035) 0:00:24.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.028) 0:00:24.780 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } 
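(Editor's aside, not part of the captured output: the fstab-verification tasks above build lists of matching `/etc/fstab` lines, compare their lengths against expected counts, and then null the facts out. A minimal Python sketch of that matching logic — the helper name, the sample entry, and the `/opt/test1` mount point are illustrative assumptions, not the role's actual code:)

```python
def fstab_matches(fstab_text, device, mount_point):
    """Collect /etc/fstab entries whose device field or mount-point field
    matches the volume under test, roughly mirroring the
    storage_test_fstab_*_matches facts set by test-verify-volume-fstab.yml.
    Illustrative helper only, not the role's implementation."""
    id_matches, mount_point_matches = [], []
    for raw in fstab_text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments, as fstab parsers do
        fields = line.split()
        if len(fields) < 2:
            continue  # malformed entry; a real parser might warn here
        if fields[0] == device:
            id_matches.append(line)
        if fields[1] == mount_point:
            mount_point_matches.append(line)
    return {"id": id_matches, "mount_point": mount_point_matches}

# Hypothetical fstab content for the /dev/mapper/foo-test1 volume seen above;
# the mount point is an assumed example.
sample = "# /etc/fstab\n/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0\n"
result = fstab_matches(sample, "/dev/mapper/foo-test1", "/opt/test1")
```

The role's assertions then reduce to comparing `len(result["id"])` and `len(result["mount_point"])` against the expected counts recorded in the facts.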
TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.033) 0:00:24.813 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.036) 0:00:24.849 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:13:31 +0000 (0:00:00.034) 0:00:24.884 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103605.2101214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103605.2101214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15977, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103605.2101214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.389) 0:00:25.273 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.037) 0:00:25.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.034) 0:00:25.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.032) 0:00:25.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.030) 0:00:25.409 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.040) 0:00:25.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.037) 0:00:25.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:13:32 
+0000 (0:00:00.029) 0:00:25.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.039) 0:00:25.557 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.039) 0:00:25.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.036) 0:00:25.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.032) 0:00:25.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.031) 0:00:25.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.029) 0:00:25.726 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.029) 0:00:25.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.036) 0:00:25.792 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.036) 0:00:25.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.031) 0:00:25.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.035) 0:00:25.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:13:32 +0000 
(0:00:00.030) 0:00:25.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.032) 0:00:25.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:13:32 +0000 (0:00:00.029) 0:00:25.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.033) 0:00:26.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.030) 0:00:26.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.030) 0:00:26.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 
01 June 2022 17:13:33 +0000 (0:00:00.031) 0:00:26.114 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.029) 0:00:26.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.030) 0:00:26.174 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:13:33 +0000 (0:00:00.589) 0:00:26.764 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.382) 0:00:27.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.037) 0:00:27.184 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.032) 
0:00:27.216 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.029) 0:00:27.246 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.030) 0:00:27.276 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.032) 0:00:27.309 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.030) 0:00:27.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.030) 0:00:27.370 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.034) 0:00:27.404 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.032) 0:00:27.436 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.037) 0:00:27.473 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035232", "end": "2022-06-01 13:13:34.240639", "rc": 0, "start": "2022-06-01 13:13:34.205407" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.411) 0:00:27.884 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.038) 0:00:27.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:13:34 +0000 (0:00:00.039) 0:00:27.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:13:34 +0000 
(0:00:00.032) 0:00:27.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.031) 0:00:28.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.031) 0:00:28.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.032) 0:00:28.092 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.032) 0:00:28.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.030) 0:00:28.155 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.027) 0:00:28.182 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, 
"storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:83 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.030) 0:00:28.213 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.056) 0:00:28.270 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.044) 0:00:28.314 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.523) 0:00:28.838 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.071) 0:00:28.909 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.032) 0:00:28.941 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:13:35 +0000 (0:00:00.029) 0:00:28.971 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.059) 0:00:29.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.025) 0:00:29.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.029) 0:00:29.085 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "name": "test1", "size": "10737418240" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.034) 0:00:29.120 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.031) 0:00:29.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.028) 0:00:29.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.028) 0:00:29.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.030) 0:00:29.240 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.029) 0:00:29.269 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.043) 0:00:29.313 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:13:36 +0000 (0:00:00.026) 0:00:29.340 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:13:37 +0000 (0:00:01.346) 0:00:30.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.030) 0:00:30.717 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.027) 0:00:30.745 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.038) 0:00:30.783 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.036) 0:00:30.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.034) 0:00:30.854 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.036) 0:00:30.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 
17:13:37 +0000 (0:00:00.030) 0:00:30.921 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.029) 0:00:30.951 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:13:37 +0000 (0:00:00.032) 0:00:30.983 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:13:38 +0000 (0:00:00.381) 0:00:31.365 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:13:38 
+0000 (0:00:00.029) 0:00:31.395 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:95 Wednesday 01 June 2022 17:13:39 +0000 (0:00:00.814) 0:00:32.210 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:13:39 +0000 (0:00:00.052) 0:00:32.262 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:13:39 +0000 (0:00:00.040) 0:00:32.303 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:13:39 +0000 (0:00:00.029) 0:00:32.333 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "25964aea-e2ef-415a-a13c-f15676a633fb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Vn3aEm-0lcW-a35e-IW6g-1rfY-IS0d-j5pUmc" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:13:39 +0000 (0:00:00.385) 0:00:32.719 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004872", "end": "2022-06-01 13:13:39.604900", "rc": 0, "start": "2022-06-01 13:13:39.600028" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.551) 0:00:33.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003007", "end": "2022-06-01 13:13:40.017112", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:13:40.014105" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.391) 0:00:33.662 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.062) 0:00:33.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.034) 0:00:33.758 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.098) 0:00:33.857 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:13:40 +0000 (0:00:00.040) 0:00:33.897 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.372) 0:00:34.270 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.041) 0:00:34.311 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.036) 0:00:34.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.035) 0:00:34.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.034) 0:00:34.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.028) 0:00:34.447 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.041) 0:00:34.488 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.058) 0:00:34.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.029) 0:00:34.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:13:41 +0000 (0:00:00.032) 0:00:34.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.761 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.791 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.056) 0:00:34.848 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.060) 0:00:34.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:13:41 +0000 (0:00:00.030) 0:00:34.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:34.999 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.063) 0:00:35.063 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.035) 0:00:35.098 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.034) 0:00:35.132 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.056) 0:00:35.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.035) 0:00:35.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.035) 0:00:35.259 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:35.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.028) 0:00:35.317 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:35.346 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.030) 0:00:35.376 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.032) 0:00:35.408 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.062) 0:00:35.471 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.069) 0:00:35.541 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.033) 0:00:35.575 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.032) 0:00:35.608 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:35.637 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.031) 0:00:35.669 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.030) 0:00:35.699 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.031) 0:00:35.731 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:35.761 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.029) 0:00:35.790 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.032) 0:00:35.823 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.066) 0:00:35.889 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:13:42 +0000 (0:00:00.076) 0:00:35.966 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.122) 0:00:36.089 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.035) 0:00:36.124 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.038) 0:00:36.163 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.030) 0:00:36.193 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.034) 0:00:36.228 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.030) 0:00:36.258 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.027) 0:00:36.286 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.028) 0:00:36.314 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.028) 0:00:36.343 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.030) 0:00:36.373 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.043) 0:00:36.417 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.034) 0:00:36.451 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.034) 0:00:36.486 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.028) 0:00:36.514 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.029) 0:00:36.544 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.034) 0:00:36.578 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.034) 0:00:36.612 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103616.9851215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103605.2101214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 15977, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103605.2101214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:13:43 +0000 (0:00:00.375) 0:00:36.988 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.038) 0:00:37.026 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.034) 0:00:37.061 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.035) 0:00:37.096 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.035) 0:00:37.132
********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.035) 0:00:37.168 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.197 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.032) 0:00:37.230 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.259 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.037) 0:00:37.297 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.327 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.031) 0:00:37.359 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.389 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.032) 0:00:37.421 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.451 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.037) 0:00:37.489 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.035) 0:00:37.524 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.554 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.583 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.031) 0:00:37.615 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.646 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.677 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.707 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.030) 0:00:37.738 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.031) 0:00:37.769 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.034) 0:00:37.803 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.029) 0:00:37.833 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:13:44 +0000 (0:00:00.035) 0:00:37.869 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.373) 0:00:38.243 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.418) 0:00:38.661 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "10737418240"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.038) 0:00:38.700 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "10737418240"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.033) 0:00:38.734 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.030) 0:00:38.765 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.031) 0:00:38.797 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.033) 0:00:38.830 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.034) 0:00:38.865 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.031) 0:00:38.896 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.033) 0:00:38.930 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "10737418240"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:13:45 +0000 (0:00:00.033) 0:00:38.963 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.037) 0:00:39.001 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.033269", "end": "2022-06-01 13:13:45.766430", "rc": 0, "start": "2022-06-01 13:13:45.733161"}
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.412) 0:00:39.413 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.038) 0:00:39.452 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.039) 0:00:39.491 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.032) 0:00:39.523 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.032) 0:00:39.556 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.032) 0:00:39.588 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.036) 0:00:39.625 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {
"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.032) 0:00:39.657 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.033) 0:00:39.690 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.027) 0:00:39.718 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:97
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.029) 0:00:39.747 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.066) 0:00:39.814 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:13:46 +0000 (0:00:00.044) 0:00:39.858 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.512) 0:00:40.371 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.068) 0:00:40.439 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.031) 0:00:40.471 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.030) 0:00:40.501 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.061) 0:00:40.563 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.028) 0:00:40.591 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.030) 0:00:40.622 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": [{"disks": ["sda"], "name": "foo", "state": "absent", "volumes": []}]}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.036) 0:00:40.658 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.033) 0:00:40.691 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.029) 0:00:40.721 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.030) 0:00:40.752 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.031) 0:00:40.784 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.029) 0:00:40.813 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.044) 0:00:40.858 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:13:47 +0000 (0:00:00.032) 0:00:40.891 ********
changed: [/cache/rhel-x.qcow2] => {"actions": [{"action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs"}, {"action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null}, {"action": "destroy device", "device": "/dev/foo", "fs_type": null}, {"action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv"}], "changed": true, "crypts": [], "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [], "packages": ["xfsprogs", "dosfstools"], "pools": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": []}], "volumes": []}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:13:49 +0000 (0:00:01.684) 0:00:42.575 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.031) 0:00:42.607 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.026) 0:00:42.634 ********
ok: [/cache/rhel-x.qcow2] => {"blivet_output": {"actions": [{"action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs"}, {"action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null}, {"action": "destroy device", "device": "/dev/foo", "fs_type": null}, {"action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv"}], "changed": true, "crypts": [], "failed": false, "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [], "packages": ["xfsprogs", "dosfstools"], "pools": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": []}], "volumes": []}}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.036) 0:00:42.670 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": [{"disks": ["sda"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": []}]}, "changed": false}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.035) 0:00:42.706 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.035) 0:00:42.741 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.034) 0:00:42.776 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.031) 0:00:42.807 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.028) 0:00:42.836 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:13:49 +0000 (0:00:00.030) 0:00:42.867 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0,
"uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:13:50 +0000 (0:00:00.397) 0:00:43.265 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:13:50 +0000 (0:00:00.029) 0:00:43.294 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=234 changed=2 unreachable=0 failed=1 skipped=179 rescued=1 ignored=0

Wednesday 01 June 2022 17:13:51 +0000 (0:00:00.824) 0:00:44.119 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.35s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.13s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : make sure blivet is available -------------- 1.10s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_scsi_generated.yml:3 -----------
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Ensure bc is installed -------------------------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:33 -------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.81s
/tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml:2 --------------------------
linux-system-roles.storage : get required packages ---------------------- 0.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
parse the actual size of the volume ------------------------------------- 0.59s
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 --------------------------
Read the /etc/fstab file for volume existence --------------------------- 0.55s
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 -----------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:13:51 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:13:53 +0000 (0:00:01.302) 0:00:01.325 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.30s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_errors.yml *************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:2
Wednesday 01 June 2022 17:13:53 +0000 (0:00:00.033) 0:00:01.358 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:18
Wednesday 01 June 2022 17:13:54 +0000 (0:00:01.091) 0:00:02.450 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:13:54 +0000 (0:00:00.037) 0:00:02.487 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:13:54 +0000 (0:00:00.153) 0:00:02.641 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.544) 0:00:03.186 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.075) 0:00:03.262 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.022) 0:00:03.285 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.021) 0:00:03.306 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.197) 0:00:03.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:13:55 +0000 (0:00:00.018) 0:00:03.523 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:13:56 +0000 (0:00:01.086) 0:00:04.609 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:13:56 +0000 (0:00:00.046) 0:00:04.655 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:13:56 +0000 (0:00:00.045) 0:00:04.701 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:13:57 +0000 (0:00:00.680) 0:00:05.381 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:13:57 +0000 (0:00:00.079) 0:00:05.461 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:13:57 +0000 (0:00:00.021) 0:00:05.482 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:13:57 +0000 (0:00:00.022) 0:00:05.504 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:13:57 +0000 (0:00:00.019) 0:00:05.524 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:13:58 +0000 (0:00:00.900) 0:00:06.425 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state":
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:00 +0000 (0:00:01.790) 0:00:08.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:00 +0000 
(0:00:00.042) 0:00:08.258 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.026) 0:00:08.285 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.516) 0:00:08.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.056) 0:00:08.858 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.028) 0:00:08.886 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.033) 0:00:08.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.031) 0:00:08.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.031) 0:00:08.983 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.026) 0:00:09.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.029) 0:00:09.039 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.027) 0:00:09.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:14:00 +0000 (0:00:00.026) 0:00:09.093 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:14:01 +0000 (0:00:00.454) 0:00:09.548 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:14:01 +0000 (0:00:00.027) 0:00:09.575 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:21 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.839) 0:00:10.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:28 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.030) 0:00:10.445 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.042) 0:00:10.488 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.541) 0:00:11.030 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.038) 0:00:11.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.029) 0:00:11.098 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Try to create LVM with an invalid disk specification.] 
******************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:35 Wednesday 01 June 2022 17:14:02 +0000 (0:00:00.032) 0:00:11.130 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.046) 0:00:11.177 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.042) 0:00:11.220 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.511) 0:00:11.731 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.066) 0:00:11.798 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.030) 0:00:11.829 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.028) 0:00:11.857 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.058) 0:00:11.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.024) 0:00:11.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.029) 0:00:11.970 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "/non/existent/disk" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.033) 0:00:12.004 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.063) 0:00:12.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.029) 0:00:12.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:03 +0000 (0:00:00.036) 0:00:12.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:04 +0000 (0:00:00.039) 0:00:12.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:04 +0000 (0:00:00.040) 0:00:12.215 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:04 +0000 (0:00:00.043) 0:00:12.258 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:04 +0000 (0:00:00.029) 0:00:12.288 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: unable to resolve any disks specified for pool 'foo' (['/non/existent/disk']) TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:05 +0000 (0:00:01.017) 0:00:13.305 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'/non/existent/disk'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, 
u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"unable to resolve any disks specified for pool 'foo' (['/non/existent/disk'])"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.039) 0:00:13.345 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:52 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.027) 0:00:13.372 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM with an invalid size specification.] 
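The fatal result above is the expected outcome of this negative test: the storage role aborts with "unable to resolve any disks specified for pool 'foo'", and the test then asserts that the failure occurred. A minimal sketch of how such a negative test is commonly structured with a block/rescue pair (task names and values here are illustrative assumptions, not the actual contents of tests_lvm_errors.yml):

```yaml
# Sketch only: run the role with an intentionally unresolvable disk spec
# inside a block, catch the expected failure in rescue, and assert on it.
# Variable values mirror the log above; structure is an assumption.
- name: Try to create LVM with an invalid disk specification
  block:
    - include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ["/non/existent/disk"]   # intentionally unresolvable
            volumes:
              - name: test1
                size: "5g"
                mount_point: /opt/test1
    - fail:
        msg: "Expected the role to fail, but it succeeded"
  rescue:
    - name: Check that we failed in the role
      assert:
        that:
          - ansible_failed_result.msg is search("unable to resolve any disks")
```

The same pattern covers the invalid-size scenario that follows ("xyz GiB"), with only the `storage_pools` contents and the asserted message changing.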
******************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:73 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.033) 0:00:13.406 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.053) 0:00:13.459 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.042) 0:00:13.501 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.515) 0:00:14.017 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.070) 0:00:14.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.031) 0:00:14.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:05 +0000 (0:00:00.032) 0:00:14.151 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.064) 0:00:14.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.025) 0:00:14.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.030) 0:00:14.271 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "xyz GiB" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.035) 0:00:14.306 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.031) 0:00:14.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.029) 0:00:14.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.031) 0:00:14.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.032) 0:00:14.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.030) 0:00:14.462 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.043) 0:00:14.506 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:06 +0000 (0:00:00.026) 0:00:14.532 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:

invalid size specification for volume 'test1': 'xyz GiB'

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:07 +0000 (0:00:01.101) 0:00:15.634 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'xyz GiB', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': 
None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid size specification for volume 'test1': 'xyz GiB'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:07 +0000 (0:00:00.040) 0:00:15.675 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:90 Wednesday 01 June 2022 17:14:07 +0000 (0:00:00.030) 0:00:15.705 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM pool with no disks specified.] 
************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:106 Wednesday 01 June 2022 17:14:07 +0000 (0:00:00.034) 0:00:15.740 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:07 +0000 (0:00:00.049) 0:00:15.789 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:07 +0000 (0:00:00.042) 0:00:15.832 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.512) 0:00:16.344 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.068) 0:00:16.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.031) 0:00:16.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.029) 0:00:16.474 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.059) 0:00:16.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.024) 0:00:16.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.029) 0:00:16.588 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.031) 0:00:16.619 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.034) 0:00:16.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.029) 0:00:16.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.029) 0:00:16.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.029) 0:00:16.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.031) 0:00:16.773 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.042) 0:00:16.815 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:08 +0000 (0:00:00.027) 0:00:16.842 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:

no disks specified for pool 'foo'

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.993) 0:00:17.836 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, 
u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"no disks specified for pool 'foo'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.040) 0:00:17.877 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:123 Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.026) 0:00:17.904 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM volume from outside of any pool.] 
********************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:139 Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.033) 0:00:17.937 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.047) 0:00:17.985 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:09 +0000 (0:00:00.042) 0:00:18.027 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.507) 0:00:18.535 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.069) 0:00:18.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.031) 0:00:18.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.031) 0:00:18.667 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.097) 0:00:18.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.025) 0:00:18.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.028) 0:00:18.819 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.032) 0:00:18.851 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [], "mount_point": "/opt/test1", "name": "test1", "size": "5g", "type": "lvm" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.033) 0:00:18.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.031) 0:00:18.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.029) 0:00:18.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.028) 0:00:18.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.028) 0:00:19.002 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.042) 0:00:19.044 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:10 +0000 (0:00:00.026) 0:00:19.071 ********
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'NoneType' object has no attribute '_device'
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false, "rc": 1 }

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.9/runpy.py", line 210, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module>
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device
  File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id
AttributeError: 'NoneType' object has no attribute '_device'

MODULE_STDERR:

Shared connection to 127.0.0.3 closed.

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:11 +0000 (0:00:01.036) 0:00:20.107 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'exception': u'Traceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id\r\nAttributeError: \'NoneType\' object has no attribute \'_device\'\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared 
connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654103650.97-97912-249004265088588/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device\r\n File "/tmp/ansible_blivet_payload_yjtdqf3t/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id\r\nAttributeError: \'NoneType\' object has no attribute \'_device\'\r\n', u'failed': True, u'rc': 1, 
u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:14:11 +0000 (0:00:00.035) 0:00:20.143 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:155
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.028) 0:00:20.171 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Try to create two pools w/ the same name] ********************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:171
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.033) 0:00:20.204 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.048) 0:00:20.253 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.042) 0:00:20.296 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.525) 0:00:20.821 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.069) 0:00:20.890 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.030) 0:00:20.920 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.029) 0:00:20.949 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.062) 0:00:21.011 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.029) 0:00:21.042 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.034) 0:00:21.076 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "pool1", "type": "lvm" }, { "disks": [ "sda" ], "name": "pool1", "type": "lvm" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.035) 0:00:21.111 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:14:12 +0000 (0:00:00.031) 0:00:21.143 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.030) 0:00:21.173 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.033) 0:00:21.207 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.030) 0:00:21.238 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.030) 0:00:21.268 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.044) 0:00:21.312 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:13 +0000 (0:00:00.028) 0:00:21.341 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: multiple pools with the same name: pool1

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:14 +0000 (0:00:01.049) 0:00:22.391 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, {u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': 
u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'multiple pools with the same name: pool1'} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.041) 0:00:22.432 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:188 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.026) 0:00:22.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the duplicate pools test] *************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:194 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.033) 0:00:22.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create two volumes w/ the same name] ****************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:203 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.033) 0:00:22.525 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.081) 0:00:22.606 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.041) 0:00:22.647 ******** ok: 
[/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:14:14 +0000 (0:00:00.506) 0:00:23.154 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.069) 0:00:23.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.030) 0:00:23.254 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.028) 0:00:23.283 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.059) 0:00:23.342 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.026) 0:00:23.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.029) 0:00:23.399 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "pool1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g", "type": "lvm" }, { "mount_point": "/opt/test2", "name": "test1", "size": "2g", "type": "lvm" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.041) 0:00:23.440 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.033) 0:00:23.474 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.028) 0:00:23.503 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.028) 0:00:23.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.031) 0:00:23.564 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.030) 0:00:23.594 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.043) 0:00:23.637 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:15 +0000 (0:00:00.026) 0:00:23.664 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: multiple volumes in pool 'pool1' with the same name: test1

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:16 +0000 (0:00:01.008) 0:00:24.672 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': u'lvm', u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}, {u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'2g', u'cache_mode': None, u'mount_point': u'/opt/test2', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': u'lvm', u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"multiple volumes in pool 'pool1' with the same name: test1"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:14:16 +0000
(0:00:00.042) 0:00:24.714 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:226
Wednesday 01 June 2022 17:14:16 +0000 (0:00:00.027) 0:00:24.742 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the output of the duplicate volumes test] *************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:232
Wednesday 01 June 2022 17:14:16 +0000 (0:00:00.031) 0:00:24.773 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Create a pool] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:239
Wednesday 01 June 2022 17:14:16 +0000 (0:00:00.033) 0:00:24.807 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:14:16 +0000 (0:00:00.047) 0:00:24.855 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:14:16 +0000 (0:00:00.042) 0:00:24.897 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.495) 0:00:25.393 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.069) 0:00:25.462 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.030) 0:00:25.493 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.031) 0:00:25.525 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.064) 0:00:25.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.024) 0:00:25.614 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.078) 0:00:25.692 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "type": "lvm", "volumes": [ { "fs_type": "ext4", "name": "testvol1", "size": "1g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.036) 0:00:25.728 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.033) 0:00:25.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.029) 0:00:25.791 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.031) 0:00:25.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.029) 0:00:25.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.031) 0:00:25.884 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.042) 0:00:25.926 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:17 +0000 (0:00:00.026) 0:00:25.953 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/testpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/testpool1-testvol1" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:14:19 +0000 (0:00:01.636) 0:00:27.590 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.030) 0:00:27.620 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.027) 0:00:27.647 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/testpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/testpool1-testvol1" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.038) 0:00:27.686 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.038) 0:00:27.724 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.033) 0:00:27.758 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.032) 0:00:27.790 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.028) 0:00:27.819 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.027) 0:00:27.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:14:19 +0000 (0:00:00.028) 0:00:27.875 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:14:20 +0000 (0:00:00.391) 0:00:28.267 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:14:20 +0000 (0:00:00.028) 0:00:28.295 ********
ok: [/cache/rhel-x.qcow2]

TASK [Try to replace file system in safe mode] *********************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:254
Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.863) 0:00:29.159 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.050) 0:00:29.209 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage
: Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.045) 0:00:29.254 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.515) 0:00:29.770 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.075) 0:00:29.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:21 
+0000 (0:00:00.031) 0:00:29.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.030) 0:00:29.908 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.060) 0:00:29.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.025) 0:00:29.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.029) 0:00:30.024 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "type": "lvm", "volumes": [ { "fs_type": "ext3", "name": "testvol1", "size": "1g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.035) 0:00:30.059 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.032) 0:00:30.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:21 +0000 (0:00:00.032) 0:00:30.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:22 +0000 (0:00:00.067) 0:00:30.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:22 +0000 (0:00:00.030) 0:00:30.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:22 +0000 (0:00:00.030) 0:00:30.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:22 +0000 (0:00:00.044) 0:00:30.297 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:22 +0000 (0:00:00.026) 0:00:30.324 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'testvol1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:23 +0000 (0:00:01.278) 0:00:31.602 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'testpool1', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': u'1g', u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, 
u'raid_spare_count': None, u'name': u'testvol1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'testvol1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.043) 0:00:31.645 ******** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:272 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.027) 0:00:31.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:278 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.034) 0:00:31.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM pool on disks that already belong to an existing pool] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:287 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.035) 0:00:31.743 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.048) 0:00:31.792 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:23 +0000 (0:00:00.048) 0:00:31.840 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.512) 0:00:32.353 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { 
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.071) 0:00:32.424 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.030) 0:00:32.455 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.038) 0:00:32.493 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.061) 0:00:32.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.034) 0:00:32.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.030) 0:00:32.619 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.035) 0:00:32.655 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.037) 0:00:32.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.027) 0:00:32.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.028) 0:00:32.749 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.038) 0:00:32.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.029) 0:00:32.817 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.044) 0:00:32.861 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:24 +0000 (0:00:00.028) 0:00:32.889 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:25 +0000 (0:00:01.258) 0:00:34.148 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or 
devices on disk 'sda' (pool 'foo') in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.040) 0:00:34.188 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:301 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.027) 0:00:34.215 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:307 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.033) 0:00:34.249 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to replace a pool by a file system on disk in safe mode] ************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:316 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.032) 0:00:34.281 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.047) 0:00:34.329 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.042) 0:00:34.372 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.523) 0:00:34.895 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) 
=> { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.071) 0:00:34.967 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.031) 0:00:34.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.031) 0:00:35.030 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get 
a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.063) 0:00:35.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.026) 0:00:35.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:26 +0000 (0:00:00.030) 0:00:35.151 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.032) 0:00:35.184 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.053) 0:00:35.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.030) 0:00:35.268 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.032) 0:00:35.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.029) 0:00:35.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.029) 0:00:35.359 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.043) 0:00:35.403 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:27 +0000 (0:00:00.027) 0:00:35.430 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:28 +0000 (0:00:01.332) 0:00:36.763 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, 
u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.044) 0:00:36.807 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:332 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.033) 0:00:36.841 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:338 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.035) 0:00:36.877 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:345 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.035) 0:00:36.912 ******** TASK [linux-system-roles.storage : set platform/version 
specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.049) 0:00:36.962 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:28 +0000 (0:00:00.044) 0:00:37.006 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.523) 0:00:37.529 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.071) 
0:00:37.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.071) 0:00:37.672 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.034) 0:00:37.707 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.062) 0:00:37.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.025) 0:00:37.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.033) 0:00:37.828 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "state": "absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.035) 0:00:37.863 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.032) 0:00:37.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.030) 0:00:37.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.031) 0:00:37.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.033) 0:00:37.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.032) 0:00:38.024 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, 
"changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.047) 0:00:38.072 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:29 +0000 (0:00:00.029) 0:00:38.101 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "destroy device", "device": "/dev/testpool1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:14:31 +0000 (0:00:01.666) 0:00:39.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] 
***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.030) 0:00:39.798 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.026) 0:00:39.825 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "destroy device", "device": "/dev/testpool1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.038) 0:00:39.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.036) 0:00:39.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.035) 0:00:39.934 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.029) 0:00:39.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.031) 0:00:39.996 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:14:31 +0000 (0:00:00.029) 0:00:40.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:14:31 
+0000 (0:00:00.030) 0:00:40.056 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:14:32 +0000 (0:00:00.383) 0:00:40.440 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:14:32 +0000 (0:00:00.030) 0:00:40.470 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=152 changed=2 unreachable=0 failed=9 skipped=113 rescued=9 ignored=0 Wednesday 01 June 2022 17:14:33 +0000 (0:00:00.864) 0:00:41.334 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.33s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.30s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.28s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.26s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.10s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:2 --------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.09s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 
1.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure required packages are installed --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. 
Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:14:33 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 
rescued=0 ignored=0 Wednesday 01 June 2022 17:14:35 +0000 (0:00:01.284) 0:00:01.307 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_errors_nvme_generated.yml ********************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_errors_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:14:35 +0000 (0:00:00.036) 0:00:01.344 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. 
Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:14:35 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 
rescued=0 ignored=0 Wednesday 01 June 2022 17:14:37 +0000 (0:00:01.297) 0:00:01.320 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.30s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_errors_scsi_generated.yml ********************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_errors_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors_scsi_generated.yml:3 Wednesday 01 June 2022 17:14:37 +0000 (0:00:00.032) 0:00:01.353 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors_scsi_generated.yml:7 Wednesday 01 June 2022 17:14:38 +0000 (0:00:01.069) 0:00:02.422 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:2 Wednesday 01 June 2022 17:14:38 +0000 (0:00:00.028) 0:00:02.451 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:18 Wednesday 01 June 2022 17:14:39 +0000 (0:00:00.803) 0:00:03.254 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:39 +0000 (0:00:00.038) 0:00:03.293 ******** 
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:39 +0000 (0:00:00.153) 0:00:03.446 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:39 +0000 (0:00:00.513) 0:00:03.959 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:39 +0000 (0:00:00.078) 0:00:04.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes 
to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:40 +0000 (0:00:00.023) 0:00:04.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:40 +0000 (0:00:00.022) 0:00:04.084 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:40 +0000 (0:00:00.194) 0:00:04.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:40 +0000 (0:00:00.018) 0:00:04.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:41 +0000 (0:00:01.084) 0:00:05.381 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:41 +0000 (0:00:00.046) 0:00:05.427 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT 
DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:41 +0000 (0:00:00.044) 0:00:05.472 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:42 +0000 (0:00:00.671) 0:00:06.144 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:14:42 +0000 (0:00:00.080) 0:00:06.224 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:14:42 +0000 (0:00:00.021) 0:00:06.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:14:42 +0000 (0:00:00.022) 0:00:06.268 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:42 +0000 (0:00:00.020) 0:00:06.289 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:43 +0000 (0:00:00.833) 0:00:07.122 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:44 +0000 (0:00:01.794) 0:00:08.916 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:44 +0000 (0:00:00.044) 0:00:08.961 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:44 +0000 (0:00:00.027) 0:00:08.989 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some 
platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.545) 0:00:09.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.030) 0:00:09.565 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.026) 0:00:09.592 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.031) 0:00:09.624 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.031) 0:00:09.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.032) 0:00:09.688 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.028) 0:00:09.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.030) 0:00:09.747 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.027) 0:00:09.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:14:45 +0000 (0:00:00.027) 0:00:09.802 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK 
[linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:14:46 +0000 (0:00:00.473) 0:00:10.276 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:14:46 +0000 (0:00:00.029) 0:00:10.305 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:21 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.857) 0:00:11.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:28 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.029) 0:00:11.191 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.040) 0:00:11.232 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.496) 0:00:11.728 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.035) 0:00:11.764 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.029) 0:00:11.794 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Try to create LVM with an invalid disk specification.] ******************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:35 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.032) 0:00:11.827 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.047) 0:00:11.874 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:47 +0000 (0:00:00.044) 0:00:11.919 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.508) 0:00:12.427 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", 
"vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.071) 0:00:12.498 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.029) 0:00:12.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.029) 0:00:12.557 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.093) 0:00:12.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 
2022 17:14:48 +0000 (0:00:00.025) 0:00:12.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.028) 0:00:12.705 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "/non/existent/disk" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.036) 0:00:12.742 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.033) 0:00:12.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.031) 0:00:12.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.029) 0:00:12.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.028) 0:00:12.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.028) 0:00:12.895 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.043) 0:00:12.938 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:48 +0000 (0:00:00.027) 0:00:12.965 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: unable to resolve any disks specified for pool 'foo' (['/non/existent/disk']) TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:49 +0000 (0:00:00.994) 0:00:13.960 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'/non/existent/disk'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, 
u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"unable to resolve any disks specified for pool 'foo' (['/non/existent/disk'])"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:49 +0000 (0:00:00.040) 0:00:14.000 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:52 Wednesday 01 June 2022 17:14:49 +0000 (0:00:00.028) 0:00:14.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM with an invalid size specification.] 
******************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:73 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.035) 0:00:14.063 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.056) 0:00:14.120 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.043) 0:00:14.163 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.487) 0:00:14.651 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.071) 0:00:14.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.032) 0:00:14.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.032) 0:00:14.788 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.063) 0:00:14.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.025) 0:00:14.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.031) 0:00:14.908 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "xyz GiB" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.035) 0:00:14.943 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.031) 0:00:14.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.028) 0:00:15.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:50 +0000 (0:00:00.030) 0:00:15.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:51 +0000 (0:00:00.028) 0:00:15.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:51 +0000 (0:00:00.030) 0:00:15.094 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:51 +0000 (0:00:00.041) 0:00:15.136 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:51 +0000 (0:00:00.026) 0:00:15.163 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: invalid size specification for volume 'test1': 'xyz GiB' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.986) 0:00:16.149 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'xyz GiB', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': 
None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid size specification for volume 'test1': 'xyz GiB'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.040) 0:00:16.190 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:90 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.026) 0:00:16.216 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM pool with no disks specified.] 
************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:106 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.034) 0:00:16.250 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.046) 0:00:16.297 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.041) 0:00:16.339 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.518) 0:00:16.857 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.072) 0:00:16.929 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.035) 0:00:16.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:52 +0000 (0:00:00.032) 0:00:16.997 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.062) 0:00:17.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.025) 0:00:17.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.030) 0:00:17.116 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.034) 0:00:17.150 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.034) 0:00:17.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.032) 0:00:17.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.029) 0:00:17.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.029) 0:00:17.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.028) 0:00:17.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.045) 0:00:17.350 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:14:53 +0000 (0:00:00.027) 0:00:17.377 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: no disks specified for pool 'foo' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:14:54 +0000 (0:00:01.019) 0:00:18.397 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': None, u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, 
u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"no disks specified for pool 'foo'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:14:54 +0000 (0:00:00.041) 0:00:18.438 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:123 Wednesday 01 June 2022 17:14:54 +0000 (0:00:00.027) 0:00:18.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM volume from outside of any pool.] 
********************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:139 Wednesday 01 June 2022 17:14:54 +0000 (0:00:00.036) 0:00:18.502 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:14:54 +0000 (0:00:00.048) 0:00:18.551 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:14:54 +0000 (0:00:00.044) 0:00:18.595 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.513) 0:00:19.109 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.069) 0:00:19.179 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.030) 0:00:19.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.030) 0:00:19.239 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.061) 0:00:19.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.056) 0:00:19.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.030) 0:00:19.388 ******** 
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.030) 0:00:19.419 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [], "mount_point": "/opt/test1", "name": "test1", "size": "5g", "type": "lvm" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.032) 0:00:19.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.030) 0:00:19.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.029) 0:00:19.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.028) 0:00:19.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.029) 0:00:19.570 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.043) 0:00:19.613 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:55 +0000 (0:00:00.027) 0:00:19.641 ********
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'NoneType' object has no attribute '_device'
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false, "rc": 1 }

MSG:
MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:
Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.9/runpy.py", line 210, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File
"/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module>
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device
  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id
AttributeError: 'NoneType' object has no attribute '_device'

MODULE_STDERR:
Shared connection to 127.0.0.3 closed.

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:56 +0000 (0:00:01.005) 0:00:20.647 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false }

MSG:
{u'exception': u'Traceback (most recent call last):\r\n  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n  File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n  File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n  File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n  File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device\r\n  File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id\r\nAttributeError: \'NoneType\' object has no attribute \'_device\'\r\n', u'_ansible_no_log': False, u'module_stderr': u'Shared
connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'Traceback (most recent call last):\r\n File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 102, in \r\n _ansiballz_main()\r\n File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/root/.ansible/tmp/ansible-tmp-1654103695.65-99138-95618345939871/AnsiballZ_blivet.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.9/runpy.py", line 210, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in \r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 549, in manage\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 357, in _look_up_device\r\n File "/tmp/ansible_blivet_payload_f7z_o122/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 676, in _get_device_id\r\nAttributeError: \'NoneType\' object has no attribute \'_device\'\r\n', u'failed': True, u'rc': 1, 
u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:14:56 +0000 (0:00:00.035) 0:00:20.682 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:155
Wednesday 01 June 2022 17:14:56 +0000 (0:00:00.026) 0:00:20.709 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Try to create two pools w/ the same name] ********************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:171
Wednesday 01 June 2022 17:14:56 +0000 (0:00:00.033) 0:00:20.742 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:14:56 +0000 (0:00:00.048) 0:00:20.791 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:14:56 +0000 (0:00:00.046) 0:00:20.838 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.515) 0:00:21.354 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason":
"Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.068) 0:00:21.422 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.030) 0:00:21.453 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.029) 0:00:21.482 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.061) 0:00:21.544 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.025) 0:00:21.570 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.030) 0:00:21.600 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "pool1", "type": "lvm" }, { "disks": [ "sda" ], "name": "pool1", "type": "lvm" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.037) 0:00:21.638 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.033) 0:00:21.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.030) 0:00:21.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.029) 0:00:21.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.029) 0:00:21.761 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.029) 0:00:21.790 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.046) 0:00:21.836 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:14:57 +0000 (0:00:00.027) 0:00:21.864 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:
multiple pools with the same name: pool1

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:14:58 +0000 (0:00:01.042) 0:00:22.906 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false }

MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, {u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state':
u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u'multiple pools with the same name: pool1'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:14:58 +0000 (0:00:00.038) 0:00:22.945 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:188
Wednesday 01 June 2022 17:14:58 +0000 (0:00:00.028) 0:00:22.973 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify the output of the duplicate pools test] ***************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:194
Wednesday 01 June 2022 17:14:58 +0000 (0:00:00.068) 0:00:23.041 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Try to create two volumes w/ the same name] ******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:203
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.034) 0:00:23.076 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.047) 0:00:23.124 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.042) 0:00:23.166 ********
ok:
[/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.526) 0:00:23.693 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.069) 0:00:23.762 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.029) 0:00:23.791 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider
tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.029) 0:00:23.821 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.061) 0:00:23.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.024) 0:00:23.907 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.029) 0:00:23.936 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "pool1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g", "type": "lvm" }, { "mount_point": "/opt/test2", "name": "test1", "size": "2g", "type": "lvm" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.037) 0:00:23.974 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.034) 0:00:24.008 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:14:59 +0000 (0:00:00.034) 0:00:24.043 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:15:00 +0000 (0:00:00.035) 0:00:24.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:15:00 +0000 (0:00:00.032) 0:00:24.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:15:00 +0000 (0:00:00.028) 0:00:24.140 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:15:00 +0000 (0:00:00.045) 0:00:24.186 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state]
***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:15:00 +0000 (0:00:00.030) 0:00:24.216 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:
multiple volumes in pool 'pool1' with the same name: test1

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:15:01 +0000 (0:00:01.026) 0:00:25.242 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'name': u'pool1', u'encryption_password': None, u'raid_metadata_version': None, u'encryption': None, u'encryption_key_size': None, u'disks': [u'sda'], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [], u'size': u'5g', u'cache_mode': None, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': u'lvm', u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}, {u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'fs_type': None, u'mount_options': None, u'raid_disks': [],
u'size': u'2g', u'cache_mode': None, u'mount_point': u'/opt/test2', u'compression': None, u'encryption_password': None, u'cached': None, u'encryption': None, u'raid_level': None, u'name': u'test1', u'state': u'present', u'vdo_pool_size': None, u'cache_size': None, u'cache_devices': [], u'type': u'lvm', u'encryption_cipher': None, u'fs_create_options': None, u'deduplication': None}], u'encryption_cipher': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"multiple volumes in pool 'pool1' with the same name: test1"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:15:01 +0000
(0:00:00.042) 0:00:25.285 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:226
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.027) 0:00:25.312 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify the output of the duplicate volumes test] *************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:232
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.035) 0:00:25.347 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Create a pool] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:239
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.033) 0:00:25.381 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.046) 0:00:25.428 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.042) 0:00:25.470 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:15:01 +0000 (0:00:00.538) 0:00:26.008 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason":
"Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.067) 0:00:26.076 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.027) 0:00:26.103 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.027) 0:00:26.130 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.059) 0:00:26.190 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.022) 0:00:26.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.030) 0:00:26.243 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "type": "lvm", "volumes": [ { "fs_type": "ext4", "name": "testvol1", "size": "1g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.035) 0:00:26.279 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.079) 0:00:26.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.030) 0:00:26.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.030) 0:00:26.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.028) 0:00:26.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.028) 0:00:26.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.041) 0:00:26.519 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:15:02 +0000 (0:00:00.030) 0:00:26.550 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/testpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", 
"/dev/vdc", "/dev/vdd", "/dev/mapper/testpool1-testvol1" ], "mounts": [], "packages": [ "e2fsprogs", "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:15:04 +0000 (0:00:01.755) 0:00:28.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.031) 0:00:28.337 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.028) 0:00:28.365 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/testpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/testpool1-testvol1" ], "mounts": [], "packages": [ "e2fsprogs", "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.039) 0:00:28.405 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/testpool1-testvol1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/testpool1-testvol1", "_raw_device": "/dev/mapper/testpool1-testvol1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "name": "testvol1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "1g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.038) 0:00:28.443 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.032) 0:00:28.476 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.030) 0:00:28.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.032) 0:00:28.539 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.032) 0:00:28.572 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.031) 0:00:28.603 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.369) 0:00:28.973 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:15:04 +0000 (0:00:00.030) 0:00:29.004 ******** ok: [/cache/rhel-x.qcow2] TASK [Try to replace file system in safe mode] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:254 Wednesday 01 June 2022 17:15:05 +0000 (0:00:00.852) 0:00:29.857 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:15:05 +0000 (0:00:00.051) 0:00:29.908 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage 
: Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:15:05 +0000 (0:00:00.044) 0:00:29.953 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.525) 0:00:30.478 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.075) 0:00:30.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:06 
+0000 (0:00:00.032) 0:00:30.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.033) 0:00:30.619 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.106) 0:00:30.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.026) 0:00:30.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.031) 0:00:30.784 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "type": "lvm", "volumes": [ { "fs_type": "ext3", "name": "testvol1", "size": "1g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.036) 0:00:30.820 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.032) 0:00:30.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.031) 0:00:30.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.032) 0:00:30.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.030) 0:00:30.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.029) 0:00:30.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.043) 0:00:31.021 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:15:06 +0000 (0:00:00.027) 0:00:31.048 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: cannot remove existing formatting on volume 'testvol1' in safe mode

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:15:08 +0000 (0:00:01.232) 0:00:32.281 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'testpool1', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': u'1g', u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0,
u'raid_spare_count': None, u'name': u'testvol1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'testvol1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.042) 0:00:32.324 ******** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:272 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.028) 0:00:32.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:278 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.033) 0:00:32.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM pool on disks that already belong to an existing pool] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:287 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.033) 0:00:32.419 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.047) 0:00:32.466 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.044) 0:00:32.511 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:15:08 +0000 (0:00:00.512) 0:00:33.023 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { 
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.071) 0:00:33.095 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.030) 0:00:33.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.030) 0:00:33.156 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.061) 0:00:33.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.025) 0:00:33.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.029) 0:00:33.272 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.035) 0:00:33.308 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.033) 0:00:33.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.030) 0:00:33.372 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.029) 0:00:33.402 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.029) 0:00:33.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.029) 0:00:33.460 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.045) 0:00:33.505 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:15:09 +0000 (0:00:00.032) 0:00:33.538 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: cannot remove existing formatting and/or devices on disk 'sda' (pool 'foo') in safe mode

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:15:10 +0000 (0:00:01.233) 0:00:34.771 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting and/or 
devices on disk 'sda' (pool 'foo') in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.042) 0:00:34.814 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:301 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.030) 0:00:34.844 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:307 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.036) 0:00:34.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to replace a pool by a file system on disk in safe mode] ************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:316 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.036) 0:00:34.917 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.092) 0:00:35.010 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:15:10 +0000 (0:00:00.043) 0:00:35.054 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.508) 0:00:35.562 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) 
=> { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.073) 0:00:35.636 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.031) 0:00:35.668 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.032) 0:00:35.700 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get 
a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.065) 0:00:35.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.026) 0:00:35.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.031) 0:00:35.824 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.033) 0:00:35.857 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "ext3", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.034) 0:00:35.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.029) 0:00:35.921 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.031) 0:00:35.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.030) 0:00:35.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:15:11 +0000 (0:00:00.031) 0:00:36.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:12 +0000 (0:00:00.044) 0:00:36.059 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:15:12 +0000 (0:00:00.028) 0:00:36.088 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on volume 'test1' in safe mode TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:15:13 +0000 (0:00:01.229) 0:00:37.318 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext3', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': None, u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, 
u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on volume 'test1' in safe mode"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.039) 0:00:37.357 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:332 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.026) 0:00:37.384 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:338 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.033) 0:00:37.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:345 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.032) 0:00:37.450 ******** TASK [linux-system-roles.storage : set platform/version 
specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.048) 0:00:37.499 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.045) 0:00:37.544 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:15:13 +0000 (0:00:00.508) 0:00:38.053 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.071) 
0:00:38.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.030) 0:00:38.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.029) 0:00:38.184 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.093) 0:00:38.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.025) 0:00:38.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.029) 0:00:38.333 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "testpool1", "state": "absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.036) 0:00:38.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.035) 0:00:38.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.030) 0:00:38.435 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.029) 0:00:38.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.030) 0:00:38.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.030) 0:00:38.525 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, 
"changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.043) 0:00:38.569 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:15:14 +0000 (0:00:00.030) 0:00:38.600 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "destroy device", "device": "/dev/testpool1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:15:16 +0000 (0:00:01.677) 0:00:40.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] 
***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.032) 0:00:40.309 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.027) 0:00:40.337 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/testpool1-testvol1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/testpool1-testvol1", "fs_type": null }, { "action": "destroy device", "device": "/dev/testpool1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.036) 0:00:40.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "name": "testpool1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.034) 0:00:40.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.033) 0:00:40.441 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.029) 0:00:40.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.031) 0:00:40.501 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.029) 0:00:40.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:15:16 
+0000 (0:00:00.030) 0:00:40.561 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.385) 0:00:40.946 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:15:16 +0000 (0:00:00.030) 0:00:40.976 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=154 changed=2 unreachable=0 failed=9 skipped=113 rescued=9 ignored=0 Wednesday 01 June 2022 17:15:17 +0000 (0:00:00.793) 0:00:41.770 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.76s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.30s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.23s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.23s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.23s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.07s /tmp/tmp7247_7fr/tests/tests_lvm_errors_scsi_generated.yml:3 ------------------ linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 
1.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Gathering Facts --------------------------------------------------------- 0.80s /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml:2 --------------------------------- linux-system-roles.storage : Update facts ------------------------------- 0.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. 
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:15:18 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:15:19 +0000 (0:00:01.257) 0:00:01.279 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_multiple_disks_multiple_volumes.yml ************************
1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:2
Wednesday 01 June 2022 17:15:19 +0000 (0:00:00.013) 0:00:01.292 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:13
Wednesday 01 June 2022 17:15:20 +0000 (0:00:01.089) 0:00:02.382 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:15:20 +0000 (0:00:00.037) 0:00:02.420 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.155) 0:00:02.575 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.535) 0:00:03.111 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.075) 0:00:03.186 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.023) 0:00:03.210 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.024) 0:00:03.234 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.189) 0:00:03.424 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:15:21 +0000 (0:00:00.019) 0:00:03.443 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:15:22 +0000 (0:00:01.094) 0:00:04.537 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.047) 0:00:04.585 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.045) 0:00:04.630 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.670) 0:00:05.301 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.080) 0:00:05.381 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.022) 0:00:05.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.024) 0:00:05.428 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:15:23 +0000 (0:00:00.022) 0:00:05.451 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:15:24 +0000 (0:00:00.804) 0:00:06.256 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": {
"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": 
"dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": 
"grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:15:26 +0000 (0:00:01.798) 0:00:08.054 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:15:26 +0000 (0:00:00.044) 0:00:08.098 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:15:26 +0000 (0:00:00.028) 0:00:08.127 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.522) 0:00:08.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.030) 0:00:08.680 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.027) 0:00:08.708 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.030) 0:00:08.738 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.031) 0:00:08.770 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.032) 0:00:08.803 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.031) 0:00:08.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.057) 0:00:08.891 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.028) 0:00:08.920 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.028) 0:00:08.949 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.496) 0:00:09.446 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:15:27 +0000 (0:00:00.029) 0:00:09.475 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:16
Wednesday 01 June 2022 17:15:28 +0000 (0:00:00.826) 0:00:10.302 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:23
Wednesday 01 June 2022 17:15:28 +0000 (0:00:00.030) 0:00:10.333 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:15:28 +0000 (0:00:00.043) 0:00:10.376 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.489) 0:00:10.866 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.036) 0:00:10.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.030) 0:00:10.932 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] }

TASK [Create a logical volume spanning two physical volumes that changes its mount location] ***
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:29
Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.032) 0:00:10.965 ********
TASK
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.051) 0:00:11.017 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:15:29 +0000 (0:00:00.041) 0:00:11.058 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.522) 0:00:11.580 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.067) 0:00:11.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.028) 0:00:11.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:11.706 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.056) 0:00:11.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.025) 0:00:11.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.027) 0:00:11.815 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" }, { 
"mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.039) 0:00:11.855 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.032) 0:00:11.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:11.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:11.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:11.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:12.005 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.075) 0:00:12.081 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:15:30 +0000 (0:00:00.029) 0:00:12.111 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/phi", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:15:32 +0000 (0:00:02.251) 0:00:14.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:32 +0000 (0:00:00.030) 0:00:14.393 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:15:32 +0000 (0:00:00.035) 0:00:14.429 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/phi", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 
0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:15:32 +0000 (0:00:00.044) 0:00:14.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:15:32 +0000 (0:00:00.039) 0:00:14.513 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:15:33 +0000 (0:00:00.034) 0:00:14.548 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:15:33 +0000 (0:00:00.029) 0:00:14.577 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:15:33 +0000 (0:00:00.920) 0:00:15.498 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:15:34 +0000 (0:00:00.925) 0:00:16.423 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:15:35 +0000 (0:00:00.661) 0:00:17.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { 
"atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:15:35 +0000 (0:00:00.386) 0:00:17.471 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:15:35 +0000 (0:00:00.029) 0:00:17.501 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:44 Wednesday 01 June 2022 17:15:36 +0000 (0:00:00.871) 0:00:18.372 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:15:36 +0000 (0:00:00.054) 0:00:18.426 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:15:36 +0000 (0:00:00.043) 0:00:18.470 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:15:36 +0000 (0:00:00.029) 0:00:18.499 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/phi-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test1", "size": "4G", "type": "lvm", "uuid": "caa8948f-85d1-4b11-9f25-df2684bdbc4f" }, "/dev/mapper/phi-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test2", "size": "4G", "type": "lvm", "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "VerFIg-QoYk-I7al-Kr9k-LbnY-rp32-V6kB7x" }, "/dev/sdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "aEO0p4-767I-wLRd-9qx4-g14M-2Jov-pFvji0" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, 
"/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:15:37 +0000 (0:00:00.501) 0:00:19.001 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003136", "end": "2022-06-01 13:15:37.340967", "rc": 0, "start": "2022-06-01 13:15:37.337831" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/phi-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/phi-test2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:15:37 +0000 (0:00:00.539) 0:00:19.541 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002642", "end": "2022-06-01 13:15:37.722309", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:15:37.719667" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:15:38 +0000 (0:00:00.410) 
0:00:19.951 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:15:38 +0000 (0:00:00.069) 0:00:20.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:15:38 +0000 (0:00:00.033) 0:00:20.053 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:15:38 +0000 (0:00:00.064) 0:00:20.118 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb", "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:15:38 +0000 (0:00:00.040) 0:00:20.158 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb", "pv": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.787) 0:00:20.946 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { 
"ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb", "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.049) 0:00:20.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.036) 0:00:21.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.033) 0:00:21.065 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.036) 0:00:21.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.029) 0:00:21.130 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, 
"pv": "/dev/sdb" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.049) 0:00:21.179 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.054) 0:00:21.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.029) 0:00:21.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.030) 0:00:21.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.028) 0:00:21.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.036) 0:00:21.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.036) 0:00:21.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.038) 0:00:21.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.033) 0:00:21.466 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:15:39 +0000 (0:00:00.032) 0:00:21.499 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.059) 0:00:21.558 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.072) 
0:00:21.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.029) 0:00:21.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.028) 0:00:21.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.027) 0:00:21.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.028) 0:00:21.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.034) 0:00:21.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.031) 0:00:21.811 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.061) 0:00:21.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.036) 0:00:21.909 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "_storage_test_pool_member_path": "/dev/sdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.047) 0:00:21.956 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.071) 0:00:22.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.039) 0:00:22.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.048) 0:00:22.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.057) 0:00:22.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.046) 0:00:22.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.045) 0:00:22.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.061) 0:00:22.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 
Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.068) 0:00:22.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:15:40 +0000 (0:00:00.067) 0:00:22.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.106) 0:00:22.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:22.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.031) 0:00:22.634 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:22.667 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 
June 2022 17:15:41 +0000 (0:00:00.031) 0:00:22.698 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.063) 0:00:22.762 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.080) 0:00:22.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:22.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.030) 0:00:22.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.031) 0:00:22.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:15:41 +0000 
(0:00:00.031) 0:00:22.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.033) 0:00:23.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.034) 0:00:23.069 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.030) 0:00:23.164 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.031) 0:00:23.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.031) 0:00:23.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, 
"_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.032) 0:00:23.389 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.071) 0:00:23.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:15:41 +0000 (0:00:00.037) 0:00:23.498 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 
17:15:42 +0000 (0:00:00.140) 0:00:23.639 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.039) 0:00:23.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "caa8948f-85d1-4b11-9f25-df2684bdbc4f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "caa8948f-85d1-4b11-9f25-df2684bdbc4f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.047) 0:00:23.726 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.040) 0:00:23.767 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.038) 0:00:23.805 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.037) 0:00:23.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.031) 0:00:23.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.033) 0:00:23.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.031) 0:00:23.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.031) 0:00:23.971 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/phi-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.047) 0:00:24.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.036) 0:00:24.055 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.042) 0:00:24.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.032) 0:00:24.130 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": 
null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.034) 0:00:24.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.044) 0:00:24.209 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:15:42 +0000 (0:00:00.040) 0:00:24.249 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103732.1001215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103732.1001215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 16957, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103732.1001215, "nlink": 1, "path": "/dev/mapper/phi-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.393) 0:00:24.642 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.037) 0:00:24.680 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.035) 0:00:24.716 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.083) 0:00:24.800 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:24.832 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.035) 0:00:24.868 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:24.900 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.030) 0:00:24.931 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:24.963 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.039) 0:00:25.003 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.029) 0:00:25.033 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:25.065 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.096 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.128 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.159 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.039) 0:00:25.199 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.036) 0:00:25.235 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.298 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.030) 0:00:25.329 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:25.393 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:25.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:25.458 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.031) 0:00:25.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:15:43 +0000 (0:00:00.032) 0:00:25.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.032) 0:00:25.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.034) 0:00:25.589 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.464) 0:00:26.054 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.374) 0:00:26.428 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.038) 0:00:26.467 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.033) 0:00:26.501 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:15:44 +0000 (0:00:00.031) 0:00:26.532 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.030) 0:00:26.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.036) 0:00:26.599 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.034) 0:00:26.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.031) 0:00:26.664 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.035) 0:00:26.700 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.034) 0:00:26.734 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.040) 0:00:26.774 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test1" ], "delta": "0:00:00.039665", "end": "2022-06-01 13:15:45.011135", "rc": 0, "start": "2022-06-01 13:15:44.971470" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.433) 0:00:27.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.038) 0:00:27.247 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.041) 0:00:27.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.033) 0:00:27.321 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.033) 0:00:27.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.033) 0:00:27.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.033) 0:00:27.423 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.035) 0:00:27.458 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:15:45 +0000 (0:00:00.042) 0:00:27.500 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.181) 0:00:27.681 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.038) 0:00:27.719 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.044) 0:00:27.764 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.039) 0:00:27.803 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.037) 0:00:27.842 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:27.880 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:27.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:27.943 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:27.974 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.033) 0:00:28.008 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/phi-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.051) 0:00:28.059 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.035) 0:00:28.095 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.037) 0:00:28.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:28.163 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.031) 0:00:28.195 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.040) 0:00:28.235 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:15:46 +0000 (0:00:00.038) 0:00:28.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103731.8421216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103731.8421216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 16919, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103731.8421216, "nlink": 1, "path": "/dev/mapper/phi-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.380) 0:00:28.654 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.039) 0:00:28.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.036) 0:00:28.731 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.034) 0:00:28.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.031) 0:00:28.796 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.037) 0:00:28.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:28.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:28.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:28.925 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.037) 0:00:28.963 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:28.993 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.032) 0:00:29.025 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:29.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.029) 0:00:29.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:29.116 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.038) 0:00:29.155 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.037) 0:00:29.192 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.034) 0:00:29.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:29.257 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:29.287 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.031) 0:00:29.318 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.030) 0:00:29.349 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.031) 0:00:29.381 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.038) 0:00:29.419 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.037) 0:00:29.457 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.033) 0:00:29.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:15:47 +0000 (0:00:00.031) 0:00:29.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.031) 0:00:29.553 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.390) 0:00:29.943 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.367) 0:00:30.311 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.038) 0:00:30.349 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.034) 0:00:30.384 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.069) 0:00:30.454 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.033) 0:00:30.487 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:15:48 +0000 (0:00:00.032) 0:00:30.520 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.031) 0:00:30.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.031) 0:00:30.583 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.034) 0:00:30.617 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.037) 0:00:30.655 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.040) 0:00:30.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test2" ], "delta": "0:00:00.034410", "end": "2022-06-01 13:15:48.915537", "rc": 0, "start": "2022-06-01 13:15:48.881127" }
STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.412) 0:00:31.107 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.039) 0:00:31.147 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.039) 0:00:31.186 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.033) 0:00:31.220 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.033) 0:00:31.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.034) 0:00:31.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.032) 0:00:31.321 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.031) 0:00:31.352 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.030) 0:00:31.383 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.029) 0:00:31.413 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:46
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.032) 0:00:31.445 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:15:49 +0000 (0:00:00.062) 0:00:31.508 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.049) 0:00:31.558 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.591) 0:00:32.149
******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.077) 0:00:32.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.033) 0:00:32.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.030) 0:00:32.291 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml 
for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.065) 0:00:32.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.027) 0:00:32.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.031) 0:00:32.415 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.037) 0:00:32.453 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.037) 0:00:32.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] 
********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:15:50 +0000 (0:00:00.030) 0:00:32.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:15:51 +0000 (0:00:00.033) 0:00:32.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:15:51 +0000 (0:00:00.033) 0:00:32.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:15:51 +0000 (0:00:00.033) 0:00:32.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:15:51 +0000 (0:00:00.045) 0:00:32.667 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:15:51 +0000 (0:00:00.034) 0:00:32.701 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", 
"/dev/mapper/phi-test1", "/dev/mapper/phi-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:15:52 +0000 (0:00:01.714) 0:00:34.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:15:52 +0000 (0:00:00.040) 0:00:34.457 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:15:52 +0000 (0:00:00.031) 0:00:34.488 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": 
"/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:15:52 +0000 (0:00:00.043) 0:00:34.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:15:53 +0000 (0:00:00.042) 0:00:34.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:15:53 +0000 (0:00:00.036) 0:00:34.611 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 
Wednesday 01 June 2022 17:15:53 +0000 (0:00:00.033) 0:00:34.644 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:15:53 +0000 (0:00:00.668) 0:00:35.313 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:15:54 +0000 (0:00:00.741) 0:00:36.055 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 
2022 17:15:55 +0000 (0:00:00.653) 0:00:36.708 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:15:55 +0000 (0:00:00.383) 0:00:37.092 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:15:55 +0000 (0:00:00.032) 0:00:37.124 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:61 Wednesday 01 June 2022 17:15:56 +0000 (0:00:00.860) 0:00:37.985 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:15:56 +0000 (0:00:00.058) 0:00:38.043 ******** 
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:15:56 +0000 (0:00:00.096) 0:00:38.139 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:15:56 +0000 (0:00:00.032) 0:00:38.171 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/phi-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test1", "size": "4G", "type": "lvm", "uuid": "caa8948f-85d1-4b11-9f25-df2684bdbc4f" }, "/dev/mapper/phi-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test2", "size": "4G", "type": "lvm", "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "VerFIg-QoYk-I7al-Kr9k-LbnY-rp32-V6kB7x" }, "/dev/sdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "aEO0p4-767I-wLRd-9qx4-g14M-2Jov-pFvji0" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, 
"/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.393) 0:00:38.565 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002495", "end": "2022-06-01 13:15:56.740265", "rc": 0, "start": "2022-06-01 13:15:56.737770" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/phi-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/phi-test2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.369) 0:00:38.935 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002550", "end": "2022-06-01 13:15:57.116536", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:15:57.113986" } TASK [Verify the volumes listed in storage_pools were correctly managed] 
******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.374) 0:00:39.310 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.072) 0:00:39.382 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.038) 0:00:39.421 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.066) 0:00:39.487 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb", "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:15:57 +0000 (0:00:00.042) 0:00:39.530 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb", "pv": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": 
"pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.715) 0:00:40.245 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb", "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.051) 0:00:40.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.040) 0:00:40.338 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.037) 0:00:40.375 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.036) 0:00:40.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.032) 0:00:40.444 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:15:58 +0000 (0:00:00.050) 0:00:40.495 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.058) 0:00:40.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.030) 0:00:40.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.029) 0:00:40.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.030) 0:00:40.644 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:40.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.031) 0:00:40.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.034) 0:00:40.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:40.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:40.808 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.058) 0:00:40.866 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.074) 0:00:40.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:40.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.031) 0:00:41.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.031) 0:00:41.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.029) 0:00:41.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.066) 0:00:41.134 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:41.167 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.061) 0:00:41.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.035) 0:00:41.265 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "_storage_test_pool_member_path": "/dev/sdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.040) 0:00:41.305 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.076) 0:00:41.381 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.039) 0:00:41.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.037) 0:00:41.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.034) 0:00:41.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:15:59 +0000 (0:00:00.032) 0:00:41.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.030) 0:00:41.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 
Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.030) 0:00:41.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.034) 0:00:41.621 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.038) 0:00:41.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.033) 0:00:41.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:41.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:41.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.032) 
0:00:41.789 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:41.820 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.075) 0:00:41.895 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.080) 0:00:41.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.034) 0:00:42.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.035) 0:00:42.046 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 
17:16:00 +0000 (0:00:00.032) 0:00:42.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.033) 0:00:42.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.034) 0:00:42.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.032) 0:00:42.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.035) 0:00:42.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.034) 0:00:42.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.280 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.033) 0:00:42.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.032) 0:00:42.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:16:00 +0000 (0:00:00.031) 0:00:42.537 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.071) 0:00:42.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.036) 0:00:42.645 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.124) 0:00:42.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.036) 0:00:42.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "caa8948f-85d1-4b11-9f25-df2684bdbc4f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": 
"caa8948f-85d1-4b11-9f25-df2684bdbc4f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.045) 0:00:42.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.036) 0:00:42.888 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.034) 0:00:42.923 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.038) 0:00:42.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.031) 0:00:42.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.031) 0:00:43.024 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.034) 0:00:43.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.085) 0:00:43.144 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/phi-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.050) 0:00:43.195 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.037) 0:00:43.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.038) 0:00:43.271 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.033) 0:00:43.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.036) 0:00:43.340 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.037) 0:00:43.378 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:16:01 +0000 (0:00:00.036) 0:00:43.414 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103732.1001215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103732.1001215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 16957, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654103732.1001215, "nlink": 1, "path": "/dev/mapper/phi-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.385) 0:00:43.799 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.039) 0:00:43.839 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.037) 0:00:43.876 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.034) 0:00:43.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.034) 0:00:43.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.036)       0:00:43.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.033)       0:00:44.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.032)       0:00:44.048 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.035)       0:00:44.083 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.044)       0:00:44.127 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.037)       0:00:44.165 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.034)       0:00:44.199 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.034)       0:00:44.234 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.033)       0:00:44.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.033)       0:00:44.300 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.040)       0:00:44.341 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.041)       0:00:44.382 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.036)       0:00:44.419 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.036)       0:00:44.456 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.032)       0:00:44.488 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:16:02 +0000 (0:00:00.032)       0:00:44.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.031)       0:00:44.552 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.034)       0:00:44.587 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.033)       0:00:44.620 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.033)       0:00:44.653 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.032)       0:00:44.686 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.031)       0:00:44.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.031)       0:00:44.749 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.366)       0:00:45.115 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.381)       0:00:45.496 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:16:03 +0000 (0:00:00.037)       0:00:45.534 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.036)       0:00:45.571 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.031)       0:00:45.602 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.030)       0:00:45.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.031)       0:00:45.665 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.030)       0:00:45.696 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.031)       0:00:45.727 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.037)       0:00:45.765 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.034)       0:00:45.799 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.046)       0:00:45.846 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test1" ], "delta": "0:00:00.040542", "end": "2022-06-01 13:16:04.071642", "rc": 0, "start": "2022-06-01 13:16:04.031100" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.418)       0:00:46.265 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.040)       0:00:46.305 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.040)       0:00:46.346 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.033)       0:00:46.379 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.032)       0:00:46.411 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.033)       0:00:46.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.032)       0:00:46.477 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:16:04 +0000 (0:00:00.030)       0:00:46.508 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.036)       0:00:46.544 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.120)       0:00:46.665 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.037)       0:00:46.703 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6a626d28-8ca5-4b7e-8487-41ab60185d36" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.045)       0:00:46.748 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.040)       0:00:46.789 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.035)       0:00:46.824 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.038)       0:00:46.863 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.031)       0:00:46.895 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.033)       0:00:46.928 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.031)       0:00:46.960 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.031)       0:00:46.992 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/phi-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.048)       0:00:47.040 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.034)       0:00:47.075 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.036)       0:00:47.111 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.033)       0:00:47.144 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.032)       0:00:47.177 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.039)       0:00:47.216 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:16:05 +0000 (0:00:00.037)       0:00:47.253 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103731.8421216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103731.8421216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 16919, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103731.8421216, "nlink": 1, "path": "/dev/mapper/phi-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.394)       0:00:47.648 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.045)       0:00:47.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.037)       0:00:47.732 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.035)       0:00:47.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.031)       0:00:47.799 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.036)       0:00:47.835 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.037)       0:00:47.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.034)       0:00:47.908 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.031)       0:00:47.939 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.042)       0:00:47.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.032)       0:00:48.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.032)       0:00:48.048 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.031)       0:00:48.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.146 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.042)       0:00:48.189 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.037)       0:00:48.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.260 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.032)       0:00:48.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.032)       0:00:48.325 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.034)       0:00:48.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.041)       0:00:48.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.033)       0:00:48.502 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:16:06 +0000 (0:00:00.032)       0:00:48.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.032)       0:00:48.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.089)       0:00:48.657 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.385)       0:00:49.043 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.399)       0:00:49.443 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.041)       0:00:49.484 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:16:07 +0000 (0:00:00.036)       0:00:49.521 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:49.554 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:49.586 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:49.619 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.035)       0:00:49.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:49.686 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.035)       0:00:49.721 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.035)       0:00:49.757 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.040)       0:00:49.798 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test2" ], "delta": "0:00:00.035524", "end": "2022-06-01 13:16:08.034552", "rc": 0, "start": "2022-06-01 13:16:07.999028" }
STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.437)       0:00:50.236 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.041)       0:00:50.277 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.042)       0:00:50.320 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.033)       0:00:50.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:50.386 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:50.418 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.035)       0:00:50.454 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.032)       0:00:50.486 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:16:08 +0000 (0:00:00.033)       0:00:50.520 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.028)       0:00:50.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:63
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.030)       0:00:50.580 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.072)       0:00:50.653 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.043)       0:00:50.697 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.505)       0:00:51.202 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.074)       0:00:51.276 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage
: define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.031) 0:00:51.308 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.031) 0:00:51.340 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.062) 0:00:51.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.029) 0:00:51.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.032) 0:00:51.465 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "state": "absent", "volumes": [] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.037) 0:00:51.502 
******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:16:09 +0000 (0:00:00.034) 0:00:51.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.031) 0:00:51.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.032) 0:00:51.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.078) 0:00:51.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.032) 0:00:51.712 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.046) 0:00:51.759 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:16:10 +0000 (0:00:00.029) 0:00:51.788 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/phi", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:16:12 +0000 (0:00:02.462) 0:00:54.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:16:12 +0000 (0:00:00.031) 0:00:54.282 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:16:12 +0000 (0:00:00.030) 0:00:54.312 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/phi", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:16:12 +0000 (0:00:00.039) 0:00:54.351 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:16:12 +0000 (0:00:00.043) 0:00:54.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:16:12 +0000 (0:00:00.045) 0:00:54.440 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/phi-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:16:13 +0000 (0:00:00.731) 0:00:55.172 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:16:14 +0000 (0:00:00.649) 0:00:55.822 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:16:14 +0000 (0:00:00.032) 0:00:55.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:16:14 +0000 (0:00:00.646) 0:00:56.501 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 
25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:16:15 +0000 (0:00:00.394) 0:00:56.895 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:16:15 +0000 (0:00:00.029) 0:00:56.925 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:73 Wednesday 01 June 2022 17:16:16 +0000 (0:00:00.822) 0:00:57.748 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:16:16 +0000 (0:00:00.062) 0:00:57.810 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", 
"volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:16:16 +0000 (0:00:00.040) 0:00:57.851 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:16:16 +0000 (0:00:00.030) 0:00:57.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the 
/etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:16:16 +0000 (0:00:00.414) 0:00:58.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002612", "end": "2022-06-01 13:16:16.489842", "rc": 0, "start": "2022-06-01 13:16:16.487230" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.387) 0:00:58.684 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002762", "end": "2022-06-01 13:16:16.869321", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:16:16.866559" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.430) 0:00:59.115 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.063) 0:00:59.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.031) 0:00:59.210 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.063) 0:00:59.273 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.036) 0:00:59.309 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.028) 0:00:59.338 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.030) 0:00:59.368 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.040) 0:00:59.409 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.040) 0:00:59.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.042) 0:00:59.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:16:17 +0000 (0:00:00.032) 0:00:59.524 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.027) 0:00:59.552 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.053) 0:00:59.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.032) 
0:00:59.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.038) 0:00:59.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.029) 0:00:59.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.032) 0:00:59.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.032) 0:00:59.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.030) 0:00:59.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.031) 0:00:59.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.036) 0:00:59.869 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.057) 0:00:59.926 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.028) 0:00:59.955 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.061) 0:01:00.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.035) 0:01:00.052 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.027) 0:01:00.079 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.027) 
0:01:00.106 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.029) 0:01:00.135 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.062) 0:01:00.198 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.029) 0:01:00.228 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.027) 0:01:00.260 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.029) 0:01:00.287 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.030) 0:01:00.317 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.030) 0:01:00.348 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=352  changed=4  unreachable=0  failed=0  skipped=266  rescued=0  ignored=0

Wednesday 01 June 2022 17:16:18 +0000 (0:00:00.016) 0:01:00.364 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.46s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.25s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.26s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:2 --------
linux-system-roles.storage : set up new/current mounts ------------------ 0.93s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Get the canonical device path for each member device -------------------- 0.79s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : set up new/current mounts ------------------ 0.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : remove obsolete mounts --------------------- 0.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Get the canonical device path for each member device -------------------- 0.72s
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------
linux-system-roles.storage : get required packages ---------------------- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:16:19 +0000 (0:00:00.021) 0:00:00.021 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:16:20 +0000 (0:00:01.283) 0:00:01.305 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_multiple_disks_multiple_volumes_nvme_generated.yml *********
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:16:20 +0000 (0:00:00.016) 0:00:01.321 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:16:21 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:16:22 +0000 (0:00:01.279) 0:00:01.303 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.28s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml *********
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml:3
Wednesday 01 June 2022 17:16:22 +0000 (0:00:00.017) 0:00:01.320 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml:7
Wednesday 01 June 2022 17:16:23 +0000 (0:00:01.061) 0:00:02.381 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:2
Wednesday 01 June 2022 17:16:23 +0000 (0:00:00.026) 0:00:02.408 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:13
Wednesday 01 June 2022 17:16:24 +0000 (0:00:00.806) 0:00:03.215 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:16:24 +0000 (0:00:00.038) 0:00:03.254 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:16:24 +0000 (0:00:00.152) 0:00:03.407 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.529) 0:00:03.937 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.079) 0:00:04.016 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.025) 0:00:04.041 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.025) 0:00:04.066 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.197) 0:00:04.264 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:16:25 +0000 (0:00:00.018) 0:00:04.282 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:16:26 +0000 (0:00:01.054) 0:00:05.337 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:16:26 +0000 (0:00:00.046) 0:00:05.384 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.046) 0:00:05.430 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.664) 0:00:06.095 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.080) 0:00:06.176 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.020) 0:00:06.196 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.022) 0:00:06.219 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:16:27 +0000 (0:00:00.019) 0:00:06.239 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:16:28 +0000 (0:00:00.826) 0:00:07.065 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status":
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:16:30 +0000 (0:00:01.811) 0:00:08.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:16:30 +0000 (0:00:00.043) 0:00:08.920 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:16:30 +0000 (0:00:00.026) 0:00:08.947 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.512) 0:00:09.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.029) 0:00:09.489 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.027) 0:00:09.516 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.032) 0:00:09.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.032) 0:00:09.581 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.031) 0:00:09.612 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.027) 0:00:09.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.030) 0:00:09.671 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.028) 0:00:09.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.028) 0:00:09.728 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.460) 0:00:10.188 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:16:31 +0000 (0:00:00.027) 0:00:10.215 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:16 Wednesday 01 June 2022 17:16:32 +0000 (0:00:00.792) 0:00:11.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:23 Wednesday 01 June 2022 17:16:32 +0000 (0:00:00.029) 0:00:11.038 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:16:32 +0000 (0:00:00.047) 0:00:11.086 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.518) 0:00:11.605 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.034) 0:00:11.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.028) 0:00:11.668 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [Create a logical volume spanning two physical volumes that changes its 
mount location] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:29 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.031) 0:00:11.699 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.052) 0:00:11.752 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.041) 0:00:11.793 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.492) 0:00:12.285 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } 
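The vars-file loop just above tries progressively more specific platform files (RedHat.yml twice, then RedHat_9.yml, then RedHat_9.1.yml), and only RedHat_9.yml is actually included. A minimal sketch of how that candidate list is typically derived, assuming the names come from the OS family, distribution, and major/minor version facts — the role's actual task file is not shown in this log:

```python
def candidate_var_files(os_family, distribution, major, minor):
    """Build vars-file candidates from least to most specific.

    Mirrors the loop items visible in the log above: RedHat.yml (os family),
    RedHat.yml (distribution), RedHat_9.yml, RedHat_9.1.yml.
    """
    return [
        f"{os_family}.yml",
        f"{distribution}.yml",
        f"{distribution}_{major}.yml",
        f"{distribution}_{major}.{minor}.yml",
    ]

# On the RHEL 9.1 test VM from this run:
print(candidate_var_files("RedHat", "RedHat", 9, 1))
# ['RedHat.yml', 'RedHat.yml', 'RedHat_9.yml', 'RedHat_9.1.yml']
```

Each loop item is guarded by a `when` condition (hence the "Conditional result was False" skips in the log); here only RedHat_9.yml satisfied it, which is where `blivet_package_list` was loaded from.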
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.069) 0:00:12.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.029) 0:00:12.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:16:33 +0000 (0:00:00.028) 0:00:12.413 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.093) 0:00:12.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.025) 0:00:12.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.029) 0:00:12.562 
******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.034) 0:00:12.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.029) 0:00:12.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.031) 0:00:12.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.028) 0:00:12.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.028) 0:00:12.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.028) 0:00:12.742 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.041) 0:00:12.784 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:16:34 +0000 (0:00:00.026) 0:00:12.810 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/phi", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "lvm2", 
"dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:16:36 +0000 (0:00:02.258) 0:00:15.069 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.030) 0:00:15.099 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.027) 0:00:15.127 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/phi", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/phi-test1", 
"/dev/mapper/phi-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.040) 0:00:15.167 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.037) 0:00:15.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.043) 0:00:15.248 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:16:36 +0000 (0:00:00.031) 0:00:15.280 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:16:37 +0000 (0:00:00.950) 0:00:16.230 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:16:38 +0000 (0:00:00.897) 0:00:17.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:16:39 +0000 (0:00:00.651) 0:00:17.779 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:16:39 +0000 (0:00:00.379) 0:00:18.159 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:16:39 +0000 (0:00:00.029) 0:00:18.189 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:44 Wednesday 01 June 2022 17:16:41 +0000 (0:00:01.259) 0:00:19.449 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:16:41 +0000 (0:00:00.053) 0:00:19.502 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:16:41 +0000 (0:00:00.042) 0:00:19.545 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:16:41 +0000 (0:00:00.065) 0:00:19.611 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/phi-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test1", "size": "4G", "type": "lvm", "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf" }, "/dev/mapper/phi-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test2", "size": "4G", "type": "lvm", "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qjBzJx-RK4l-Gpre-9j6U-iyWX-Ve33-cHw8wB" }, "/dev/sdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "AzsIl9-5KZU-lqIl-y3N4-RUta-N5eP-gCYDU9" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", 
"label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:16:41 +0000 (0:00:00.488) 0:00:20.099 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003062", "end": "2022-06-01 13:16:41.490516", "rc": 0, "start": "2022-06-01 13:16:41.487454" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/phi-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/phi-test2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.467) 0:00:20.567 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003131", "end": "2022-06-01 13:16:41.858850", "failed_when_result": false, "rc": 0, "start": "2022-06-01 
13:16:41.855719" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.366) 0:00:20.933 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.076) 0:00:21.010 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.032) 0:00:21.043 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.066) 0:00:21.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb", "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:16:42 +0000 (0:00:00.040) 0:00:21.150 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb", "pv": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.794) 0:00:21.944 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb", "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.050) 0:00:21.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.038) 0:00:22.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.035) 0:00:22.069 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.036) 0:00:22.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] 
*********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.029) 0:00:22.135 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.049) 0:00:22.184 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.070) 0:00:22.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.034) 0:00:22.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.030) 0:00:22.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.031) 0:00:22.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.031) 0:00:22.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:16:43 +0000 (0:00:00.032) 0:00:22.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.031) 0:00:22.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.030) 0:00:22.477 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.032) 0:00:22.509 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.059) 0:00:22.569 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.073) 0:00:22.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.029) 0:00:22.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.029) 0:00:22.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.081) 0:00:22.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.031) 0:00:22.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.031) 0:00:22.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.030) 0:00:22.877 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.064) 0:00:22.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.035) 0:00:22.977 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "_storage_test_pool_member_path": "/dev/sdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.039) 0:00:23.016 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:16:44 +0000 
(0:00:00.068) 0:00:23.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.037) 0:00:23.122 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.036) 0:00:23.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.030) 0:00:23.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.029) 0:00:23.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.030) 0:00:23.248 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.031) 0:00:23.279 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.037) 0:00:23.316 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.035) 0:00:23.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.030) 0:00:23.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:16:44 +0000 (0:00:00.029) 0:00:23.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.029) 0:00:23.440 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.030) 0:00:23.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.034) 0:00:23.505 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.067) 0:00:23.573 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.094) 0:00:23.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.042) 0:00:23.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:23.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:23.774 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.028) 0:00:23.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:23.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.033) 0:00:23.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.030) 0:00:23.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:23.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.030) 0:00:23.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.030) 0:00:23.992 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.033) 0:00:24.025 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.031) 0:00:24.056 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.031) 0:00:24.088 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:24.120 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.031) 0:00:24.151 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:24.184 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.032) 0:00:24.216 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.082) 0:00:24.298 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:16:45 +0000 (0:00:00.036) 0:00:24.334 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.134) 0:00:24.469 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/phi-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.037) 0:00:24.506 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.045) 0:00:24.552 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.037) 0:00:24.589 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.032) 0:00:24.622 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.036) 0:00:24.658 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.071) 0:00:24.730 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.031) 0:00:24.762 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.030) 0:00:24.792 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.029) 0:00:24.821 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/phi-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.045) 0:00:24.866 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.038) 0:00:24.904 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.036) 0:00:24.941 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.028) 0:00:24.970 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.033) 0:00:25.003 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.042) 0:00:25.045 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:16:46 +0000 (0:00:00.048) 0:00:25.093 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103795.9071214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103795.9071214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17214, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103795.9071214, "nlink": 1, "path": "/dev/mapper/phi-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.392) 0:00:25.486 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.038) 0:00:25.525 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.034) 0:00:25.559 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.031) 0:00:25.591 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.028) 0:00:25.619 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:25.652 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.028) 0:00:25.680 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.033) 0:00:25.714 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.028) 0:00:25.742 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.034) 0:00:25.777 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.031) 0:00:25.808 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:25.841 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.031) 0:00:25.873 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.033) 0:00:25.906 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.029) 0:00:25.936 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.037) 0:00:25.973 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.036) 0:00:26.009 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.033) 0:00:26.043 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:26.075 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.036) 0:00:26.112 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.035) 0:00:26.147 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:26.180 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:26.212 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.033) 0:00:26.246 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.032) 0:00:26.279 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.037) 0:00:26.316 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.034) 0:00:26.351 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:16:47 +0000 (0:00:00.034) 0:00:26.385 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.504) 0:00:26.890 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.352) 0:00:27.242 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.037) 0:00:27.279 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.033) 0:00:27.313 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.033) 0:00:27.347 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.031) 0:00:27.378 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:16:48 +0000 (0:00:00.033) 0:00:27.412 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.031) 0:00:27.444 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.029) 0:00:27.473 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.034) 0:00:27.508 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.036) 0:00:27.544 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.075) 0:00:27.619 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test1"], "delta": "0:00:00.038997", "end": "2022-06-01 13:16:48.948542", "rc": 0, "start": "2022-06-01 13:16:48.909545"}
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.405) 0:00:28.025 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.036) 0:00:28.062 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.040) 0:00:28.102 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.031) 0:00:28.133 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.031) 0:00:28.165 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.034) 0:00:28.199 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.031) 0:00:28.231 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.030) 0:00:28.262 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:16:49 +0000 (0:00:00.035) 0:00:28.297 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.130) 0:00:28.427 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/phi-test2"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.040) 0:00:28.468 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.047) 0:00:28.516 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.038) 0:00:28.554 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.036) 0:00:28.591 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.038) 0:00:28.629 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.031) 0:00:28.660 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.029) 0:00:28.690 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.031) 0:00:28.722 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.030) 0:00:28.752 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/phi-test2 "], "storage_test_fstab_mount_options_matches": [" /opt/test2 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test2 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.045) 0:00:28.798 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.034) 0:00:28.833 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.035) 0:00:28.868 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.029) 0:00:28.898 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.033) 0:00:28.932 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.038) 0:00:28.970 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.038) 0:00:29.009 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103795.6721215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103795.6721215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17176, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103795.6721215, "nlink": 1, "path": "/dev/mapper/phi-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.373) 0:00:29.382 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:16:50 +0000 (0:00:00.038) 0:00:29.421 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.037) 0:00:29.459 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.034) 0:00:29.493 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.029) 0:00:29.523 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.038) 0:00:29.562 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.030) 0:00:29.592 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.624 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.655 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.037) 0:00:29.693 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.030) 0:00:29.723 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.755 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.786 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.817 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:29.849 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.038) 0:00:29.887 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.036) 0:00:29.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.033) 0:00:29.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.032) 0:00:29.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.032) 0:00:30.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.032) 0:00:30.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:30.087 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.032) 0:00:30.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.082) 0:00:30.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:30.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.033) 0:00:30.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.031) 0:00:30.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:16:51 +0000 (0:00:00.032) 0:00:30.330 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 
GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.374) 0:00:30.705 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.380) 0:00:31.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.039) 0:00:31.125 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.035) 0:00:31.160 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.032) 0:00:31.193 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.030) 0:00:31.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:16:52 +0000 
(0:00:00.030) 0:00:31.254 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.035) 0:00:31.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.033) 0:00:31.322 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.034) 0:00:31.357 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:16:52 +0000 (0:00:00.036) 0:00:31.394 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.044) 0:00:31.438 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test2" ], "delta": "0:00:00.035634", "end": "2022-06-01 13:16:52.774583", "rc": 0, "start": "2022-06-01 13:16:52.738949" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= 
LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.412) 0:00:31.850 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.039) 0:00:31.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.040) 0:00:31.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.033) 0:00:31.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.031) 0:00:31.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.034) 0:00:32.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.031) 0:00:32.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.030) 0:00:32.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.030) 0:00:32.122 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.028) 0:00:32.150 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:46 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.030) 0:00:32.181 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:16:53 +0000 (0:00:00.058) 0:00:32.239 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 
June 2022 17:16:53 +0000 (0:00:00.042) 0:00:32.282 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.517) 0:00:32.800 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.072) 0:00:32.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.031) 0:00:32.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.031) 0:00:32.935 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.064) 0:00:33.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.025) 0:00:33.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.029) 0:00:33.055 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.037) 0:00:33.093 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.036) 0:00:33.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.038) 0:00:33.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.078) 0:00:33.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.030) 0:00:33.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.030) 0:00:33.308 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.043) 0:00:33.351 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:16:54 +0000 (0:00:00.029) 0:00:33.380 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:16:56 +0000 (0:00:01.670) 0:00:35.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.033) 0:00:35.085 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.031) 0:00:35.117 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { 
"actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/phi-test1", "/dev/mapper/phi-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.043) 0:00:35.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.041) 0:00:35.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.035) 0:00:35.237 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:16:56 +0000 (0:00:00.030) 0:00:35.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:16:57 +0000 (0:00:00.685) 0:00:35.953 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 
June 2022 17:16:58 +0000 (0:00:00.749) 0:00:36.703 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:16:58 +0000 (0:00:00.691) 0:00:37.394 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:16:59 +0000 (0:00:00.368) 0:00:37.763 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:16:59 +0000 (0:00:00.032) 0:00:37.796 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:61 Wednesday 01 June 2022 17:17:00 +0000 
(0:00:00.866) 0:00:38.663 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:17:00 +0000 (0:00:00.057) 0:00:38.720 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/phi-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/phi-test1", "_raw_device": "/dev/mapper/phi-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/phi-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/phi-test2", "_raw_device": "/dev/mapper/phi-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:17:00 +0000 (0:00:00.090) 0:00:38.811 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:17:00 +0000 (0:00:00.030) 0:00:38.842 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/phi-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test1", "size": "4G", "type": "lvm", "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf" }, "/dev/mapper/phi-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/phi-test2", "size": "4G", "type": "lvm", "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qjBzJx-RK4l-Gpre-9j6U-iyWX-Ve33-cHw8wB" }, "/dev/sdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "AzsIl9-5KZU-lqIl-y3N4-RUta-N5eP-gCYDU9" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", 
"label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:17:00 +0000 (0:00:00.387) 0:00:39.229 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002848", "end": "2022-06-01 13:17:00.528849", "rc": 0, "start": "2022-06-01 13:17:00.526001" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/phi-test1 /opt/test1 xfs defaults 0 0
/dev/mapper/phi-test2 /opt/test2 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:17:01 +0000
(0:00:00.376) 0:00:39.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003093", "end": "2022-06-01 13:17:00.901046", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:17:00.897953" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:17:01 +0000 (0:00:00.376) 0:00:39.983 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:17:01 +0000 (0:00:00.071) 0:00:40.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:17:01 +0000 (0:00:00.031) 0:00:40.085 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:17:01 +0000 (0:00:00.077) 0:00:40.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "2", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sdb", "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:17:01 +0000 (0:00:00.043) 0:00:40.206 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb", "pv": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.697) 0:00:40.904 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sdb" } ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sdb", "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 1, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.058) 0:00:40.963 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.043) 0:00:41.007 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.037) 0:00:41.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": 
"disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.042) 0:00:41.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.034) 0:00:41.121 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb" } MSG: All assertions passed ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.053) 0:00:41.175 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.058) 0:00:41.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.032) 0:00:41.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.037) 0:00:41.304 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.032) 0:00:41.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.031) 0:00:41.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:17:02 +0000 (0:00:00.032) 0:00:41.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.030) 0:00:41.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.035) 0:00:41.466 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.031) 0:00:41.498 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.058) 0:00:41.557 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.073) 0:00:41.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.031) 0:00:41.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.029) 0:00:41.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.030) 0:00:41.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.029) 0:00:41.752 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.075) 0:00:41.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.031) 0:00:41.859 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.060) 0:00:41.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.035) 0:00:41.955 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sdb) => { "_storage_test_pool_member_path": "/dev/sdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:17:03 +0000 
(0:00:00.040) 0:00:41.995 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.078) 0:00:42.074 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.036) 0:00:42.111 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.035) 0:00:42.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.030) 0:00:42.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.032) 0:00:42.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 
17:17:03 +0000 (0:00:00.031) 0:00:42.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.031) 0:00:42.273 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.035) 0:00:42.308 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.036) 0:00:42.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.031) 0:00:42.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:17:03 +0000 (0:00:00.032) 0:00:42.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:42.440 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.032) 0:00:42.472 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.030) 0:00:42.503 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.073) 0:00:42.576 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.079) 0:00:42.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.033) 0:00:42.689 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:17:04 +0000 
(0:00:00.035) 0:00:42.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.032) 0:00:42.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.032) 0:00:42.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:42.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.032) 0:00:42.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.030) 0:00:42.885 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.035) 0:00:42.920 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.030) 0:00:42.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.029) 0:00:42.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.029) 0:00:43.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:43.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.030) 0:00:43.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:43.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.030) 0:00:43.134 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:43.166 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.031) 0:00:43.197 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
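The `[WARNING]` above is Ansible's standard hint for nested loops: the outer task that includes `test-verify-volume.yml` loops with a variable name (`storage_test_volume`) that an inner looping task reuses. A minimal sketch of the fix the warning suggests, renaming the outer loop variable via `loop_control` (the task name and variable names here are illustrative, not taken from the role):

```yaml
# Hypothetical outer include: give the loop variable a distinct name so
# looping tasks inside the included file cannot clobber it.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: outer_volume   # avoids colliding with 'storage_test_volume'
```

With a unique `loop_var`, the inner file would reference `outer_volume` instead, and the collision warning disappears.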
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.074) 0:00:43.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:17:04 +0000 (0:00:00.039) 0:00:43.311 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.125) 0:00:43.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
17:17:05 +0000 (0:00:00.036) 0:00:43.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "abdf14cd-b9b2-40ed-88cb-f95fad19a8bf" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.042) 0:00:43.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.040) 0:00:43.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.037) 0:00:43.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.039) 0:00:43.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.031) 0:00:43.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.035) 0:00:43.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.031) 0:00:43.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.035) 0:00:43.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/phi-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.095) 0:00:43.863 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.037) 0:00:43.900 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.037) 0:00:43.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.029) 0:00:43.967 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.031) 0:00:43.999 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.040) 0:00:44.039 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:17:05 +0000 (0:00:00.036) 0:00:44.075 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103795.9071214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103795.9071214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17214, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103795.9071214, "nlink": 1, "path": "/dev/mapper/phi-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.409) 0:00:44.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.039) 0:00:44.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.037) 0:00:44.562 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.044) 0:00:44.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.039) 0:00:44.646 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.042) 0:00:44.688 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.034) 0:00:44.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.031) 0:00:44.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.033) 0:00:44.787 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.040) 0:00:44.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.031) 0:00:44.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.034) 0:00:44.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.032) 0:00:44.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.032) 0:00:44.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.032) 
0:00:44.992 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.039) 0:00:45.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.044) 0:00:45.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.034) 0:00:45.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.032) 0:00:45.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.032) 0:00:45.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.034) 0:00:45.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.033) 0:00:45.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.031) 0:00:45.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.037) 0:00:45.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.030) 0:00:45.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.029) 0:00:45.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:17:06 +0000 (0:00:00.029) 0:00:45.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.029) 0:00:45.431 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.390) 0:00:45.821 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.393) 0:00:46.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.040) 0:00:46.255 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.042) 0:00:46.297 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.039) 0:00:46.337 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.032) 0:00:46.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:17:07 +0000 (0:00:00.032) 0:00:46.402 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.032) 0:00:46.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.031) 0:00:46.466 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.038) 0:00:46.505 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.032) 0:00:46.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.037) 0:00:46.574 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test1" ], "delta": "0:00:00.040673", "end": "2022-06-01 13:17:07.927019", "rc": 0, "start": "2022-06-01 13:17:07.886346" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.430) 0:00:47.005 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.038) 0:00:47.044 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.043) 0:00:47.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.037) 0:00:47.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.034) 0:00:47.159 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.038) 0:00:47.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.035) 0:00:47.233 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.034) 0:00:47.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:17:08 +0000 (0:00:00.037) 0:00:47.305 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.131) 0:00:47.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/phi-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.039) 0:00:47.477 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/phi-test2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "6fae22a4-e877-4d99-bc0b-277ce925bf92" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.046) 0:00:47.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.038) 0:00:47.563 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.036) 0:00:47.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.037) 0:00:47.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.028) 0:00:47.666 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.030) 0:00:47.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.029) 0:00:47.726 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.029) 0:00:47.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/phi-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.047) 0:00:47.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.035) 0:00:47.839 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.036) 0:00:47.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.032) 0:00:47.908 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.031) 0:00:47.939 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.036) 0:00:47.976 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.036) 0:00:48.013 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103795.6721215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103795.6721215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17176, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103795.6721215, "nlink": 1, "path": "/dev/mapper/phi-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, 
"version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:17:09 +0000 (0:00:00.393) 0:00:48.406 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.036) 0:00:48.442 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.036) 0:00:48.479 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.034) 0:00:48.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:48.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.034) 0:00:48.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.030) 0:00:48.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.030) 0:00:48.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:48.672 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.039) 0:00:48.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:48.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.029) 0:00:48.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.030) 0:00:48.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.028) 0:00:48.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.027) 0:00:48.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.040) 0:00:48.900 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.036) 0:00:48.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.029) 0:00:48.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.030) 0:00:48.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.030) 0:00:49.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.032) 0:00:49.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.032) 0:00:49.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:49.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:49.155 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:49.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:49.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.031) 0:00:49.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:17:10 +0000 (0:00:00.084) 0:00:49.333 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.377) 0:00:49.711 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.370) 0:00:50.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": 
"4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.040) 0:00:50.122 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.036) 0:00:50.159 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.031) 0:00:50.190 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.033) 0:00:50.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.033) 0:00:50.257 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.032) 0:00:50.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.036) 0:00:50.327 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.037) 0:00:50.364 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:17:11 +0000 (0:00:00.035) 0:00:50.400 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.040) 0:00:50.440 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "phi/test2" ], "delta": "0:00:00.035290", "end": "2022-06-01 13:17:11.784658", "rc": 0, "start": "2022-06-01 13:17:11.749368" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.422) 0:00:50.862 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.039) 0:00:50.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.041) 0:00:50.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.040) 0:00:50.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.034) 0:00:51.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.032) 0:00:51.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.034) 0:00:51.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.033) 0:00:51.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly 
managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.031) 0:00:51.151 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.030) 0:00:51.182 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:63 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.031) 0:00:51.214 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.082) 0:00:51.296 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:17:12 +0000 (0:00:00.050) 0:00:51.347 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.515) 0:00:51.862 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was 
False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.073) 0:00:51.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.033) 0:00:51.969 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.032) 0:00:52.002 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.064) 0:00:52.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.026) 0:00:52.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.032) 0:00:52.126 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "phi", "state": "absent", "volumes": [] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.035) 0:00:52.161 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.032) 0:00:52.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.031) 0:00:52.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.030) 0:00:52.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.029) 0:00:52.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.075) 0:00:52.360 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:17:13 +0000 (0:00:00.047) 0:00:52.408 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:17:14 +0000 (0:00:00.029) 0:00:52.437 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/phi", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, 
{ "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:17:16 +0000 (0:00:02.519) 0:00:54.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:17:16 +0000 (0:00:00.032) 0:00:54.989 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:17:16 +0000 (0:00:00.029) 0:00:55.019 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/phi-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test1", "fs_type": null }, { 
"action": "destroy format", "device": "/dev/mapper/phi-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/phi-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/phi", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:17:16 +0000 (0:00:00.047) 0:00:55.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK 
[linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:17:16 +0000 (0:00:00.037) 0:00:55.104 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:17:16 +0000 (0:00:00.038) 0:00:55.143 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/phi-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/phi-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/phi-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/phi-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:17:17 +0000 (0:00:00.747) 0:00:55.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 
Wednesday 01 June 2022 17:17:18 +0000 (0:00:00.664) 0:00:56.555 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:17:18 +0000 (0:00:00.029) 0:00:56.584 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:17:18 +0000 (0:00:00.627) 0:00:57.212 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:17:19 +0000 (0:00:00.402) 0:00:57.614 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:17:19 
+0000 (0:00:00.031) 0:00:57.645 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:73 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.825) 0:00:58.471 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.058) 0:00:58.529 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "phi", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.037) 0:00:58.566 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.028) 0:00:58.595 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.404) 0:00:59.000 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002999", 
"end": "2022-06-01 13:17:20.309817", "rc": 0, "start": "2022-06-01 13:17:20.306818" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:17:20 +0000 (0:00:00.386) 0:00:59.387 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002881", "end": "2022-06-01 13:17:20.679368", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:17:20.676487" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.369) 0:00:59.756 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.062) 0:00:59.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.034) 0:00:59.853 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.064) 0:00:59.918 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.082) 0:01:00.001 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.029) 0:01:00.030 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.028) 0:01:00.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.037) 0:01:00.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.036) 0:01:00.134 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.037) 0:01:00.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.032) 0:01:00.204 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.028) 0:01:00.232 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.054) 0:01:00.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.030) 
0:01:00.317 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.032) 0:01:00.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.031) 0:01:00.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:17:21 +0000 (0:00:00.029) 0:01:00.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.029) 0:01:00.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.030) 0:01:00.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.029) 0:01:00.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.033) 0:01:00.534 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.056) 0:01:00.591 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.028) 0:01:00.619 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.066) 0:01:00.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.033) 0:01:00.719 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.027) 0:01:00.746 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.027) 
0:01:00.774 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.030) 0:01:00.805 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.065) 0:01:00.871 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.028) 0:01:00.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.031) 0:01:00.931 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.033) 0:01:00.964 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.034) 0:01:00.999 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.029) 0:01:01.028 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=354 changed=4 unreachable=0 failed=0 skipped=266 rescued=0 ignored=0 Wednesday 01 June 2022 17:17:22 +0000 (0:00:00.016) 0:01:01.045 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.52s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.26s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : Update facts ------------------------------- 1.26s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 1.06s /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml:3 linux-system-roles.storage : make sure blivet is available -------------- 1.05s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : set up new/current mounts ------------------ 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.81s /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml:2 -------- Get the canonical device path for each member device -------------------- 0.79s /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------ linux-system-roles.storage : Update facts ------------------------------- 0.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : set up new/current mounts ------------------ 0.75s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : remove obsolete mounts --------------------- 0.75s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Get the canonical device path for each member device -------------------- 0.70s /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------ linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:17:23 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:17:24 +0000 (0:00:01.283) 0:00:01.307 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_one_disk_multiple_volumes.yml ****************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:2 
Wednesday 01 June 2022 17:17:24 +0000 (0:00:00.016) 0:00:01.324 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:11 Wednesday 01 June 2022 17:17:25 +0000 (0:00:01.079) 0:00:02.403 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:17:25 +0000 (0:00:00.040) 0:00:02.443 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:17:25 +0000 (0:00:00.159) 0:00:02.603 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.530) 0:00:03.133 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.076) 0:00:03.209 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.023) 0:00:03.232 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.024) 0:00:03.257 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.192) 0:00:03.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:17:26 +0000 (0:00:00.020) 0:00:03.470 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:17:27 +0000 (0:00:01.057) 0:00:04.528 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:17:27 +0000 (0:00:00.047) 0:00:04.576 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:17:27 +0000 (0:00:00.046) 0:00:04.622 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:17:28 +0000 (0:00:00.658) 0:00:05.280 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:17:28 +0000 (0:00:00.080) 0:00:05.361 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:17:28 +0000 (0:00:00.021) 0:00:05.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:17:28 +0000 (0:00:00.022) 0:00:05.405 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:17:28 +0000 (0:00:00.020) 0:00:05.425 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:17:29 +0000 (0:00:00.817) 0:00:06.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:17:31 +0000 (0:00:01.814) 0:00:08.057 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:17:31 +0000 
(0:00:00.042) 0:00:08.100 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:17:31 +0000 (0:00:00.027) 0:00:08.128 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.515) 0:00:08.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.057) 0:00:08.700 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.026) 0:00:08.727 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.032) 0:00:08.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.030) 0:00:08.790 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.030) 0:00:08.821 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.027) 0:00:08.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.028) 0:00:08.877 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.027) 0:00:08.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.027) 0:00:08.932 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.488) 0:00:09.420 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:17:32 +0000 (0:00:00.029) 0:00:09.450 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:14 Wednesday 01 June 2022 17:17:33 +0000 (0:00:00.837) 0:00:10.287 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:21 Wednesday 01 June 2022 17:17:33 +0000 (0:00:00.030) 0:00:10.318 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:17:33 
+0000 (0:00:00.043) 0:00:10.361 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.520) 0:00:10.882 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.036) 0:00:10.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.031) 0:00:10.950 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create three LVM logical volumes under one volume group] ***************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:26 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.032) 0:00:10.983 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.054) 0:00:11.038 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.042) 0:00:11.080 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:17:34 +0000 (0:00:00.495) 0:00:11.576 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.067) 0:00:11.644 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.029) 0:00:11.674 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:17:35 +0000 (0:00:00.029) 0:00:11.703 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.057) 0:00:11.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.022) 0:00:11.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.026) 0:00:11.810 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.037) 0:00:11.848 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:17:35 +0000 
(0:00:00.074) 0:00:11.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.030) 0:00:11.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.028) 0:00:11.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.028) 0:00:12.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.028) 0:00:12.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:17:35 +0000 (0:00:00.040) 0:00:12.079 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 
17:17:35 +0000 (0:00:00.027) 0:00:12.107 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:17:37 +0000 (0:00:02.245) 0:00:14.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:17:37 +0000 (0:00:00.030) 0:00:14.384 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:17:37 +0000 (0:00:00.027) 0:00:14.411 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test2", 
"fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:17:37 +0000 (0:00:00.045) 0:00:14.457 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", 
"_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 
17:17:37 +0000 (0:00:00.046) 0:00:14.504 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:17:37 +0000 (0:00:00.047) 0:00:14.552 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:17:37 +0000 (0:00:00.034) 0:00:14.586 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:17:38 +0000 (0:00:00.917) 0:00:15.504 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" }
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:17:40 +0000 (0:00:01.308) 0:00:16.812 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:17:40 +0000 (0:00:00.652) 0:00:17.464 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr":
true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:17:41 +0000 (0:00:00.373) 0:00:17.837 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:17:41 +0000 (0:00:00.029) 0:00:17.867 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:44 Wednesday 01 June 2022 17:17:42 +0000 (0:00:01.064) 0:00:18.931 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:17:42 +0000 (0:00:00.054) 0:00:18.986 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:17:42 +0000 (0:00:00.102) 0:00:19.088 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:17:42 +0000 (0:00:00.031) 0:00:19.119 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "3G", "type": "lvm", "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" }, "/dev/mapper/foo-test3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test3", "size": "3G", "type": "lvm", "uuid": "1b2626e4-5f79-4fc7-8f10-b4491ed5902e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "03U2cv-5u0P-6DD5-Dxdc-gOPQ-C4oo-93yqJC" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": 
"disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:17:42 +0000 (0:00:00.486) 0:00:19.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002801", "end": "2022-06-01 13:17:42.795609", "rc": 0, "start": "2022-06-01 13:17:42.792808" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0 /dev/mapper/foo-test3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:17:43 +0000 (0:00:00.470) 0:00:20.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002468", 
"end": "2022-06-01 13:17:43.158352", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:17:43.155884" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:17:43 +0000 (0:00:00.358) 0:00:20.435 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:17:43 +0000 (0:00:00.073) 0:00:20.509 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:17:43 +0000 (0:00:00.030) 0:00:20.539 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:17:43 +0000 (0:00:00.067) 0:00:20.607 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.040) 0:00:20.647 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.514) 0:00:21.162 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.042) 0:00:21.205 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.038) 0:00:21.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.035) 0:00:21.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.037) 0:00:21.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.030) 0:00:21.346 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK 
[Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.041) 0:00:21.388 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.057) 0:00:21.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.032) 0:00:21.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.031) 0:00:21.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.031) 0:00:21.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.030) 0:00:21.572 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 
Wednesday 01 June 2022 17:17:44 +0000 (0:00:00.029) 0:00:21.602 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.028) 0:00:21.630 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.030) 0:00:21.660 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.028) 0:00:21.689 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.058) 0:00:21.747 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.083) 0:00:21.831 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.029) 0:00:21.860 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.028) 0:00:21.889 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.028) 0:00:21.917 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.031) 0:00:21.949 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.064) 0:00:22.013 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.029) 0:00:22.043 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.030) 0:00:22.073 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.028) 0:00:22.102 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.030) 0:00:22.133 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.064) 0:00:22.197 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.036) 0:00:22.233 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.036) 0:00:22.270 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.060) 0:00:22.330 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.035) 0:00:22.366 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.037) 0:00:22.403 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.031) 0:00:22.434 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.031) 0:00:22.466 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.031) 0:00:22.498 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.035) 0:00:22.533 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:17:45 +0000 (0:00:00.031) 0:00:22.565 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.061) 0:00:22.626 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.092) 0:00:22.719 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.028) 0:00:22.747 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.027) 0:00:22.774 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.034) 0:00:22.809 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.840 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.870 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.900 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.930 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.961 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:22.991 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:23.021 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.034) 0:00:23.056 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.039) 0:00:23.095 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.033) 0:00:23.128 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.031) 0:00:23.160 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.033) 0:00:23.193 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.030) 0:00:23.224 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.256 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.036) 0:00:23.292 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.033) 0:00:23.326 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.359 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.034) 0:00:23.393 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.426 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.458 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.490 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.032) 0:00:23.523 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:17:46 +0000 (0:00:00.086) 0:00:23.609 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.037) 0:00:23.646 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.128) 0:00:23.774 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.038) 0:00:23.813 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.089) 0:00:23.903 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.037) 0:00:23.940 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.035) 0:00:23.976 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.035) 0:00:24.011 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.029) 0:00:24.040 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.031) 0:00:24.072 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.028) 0:00:24.100 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.028) 0:00:24.129 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/foo-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.047) 0:00:24.177 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.034) 0:00:24.211 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.038) 0:00:24.250 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.036) 0:00:24.287 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.033) 0:00:24.321 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.037) 0:00:24.358 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:17:47 +0000 (0:00:00.037) 0:00:24.396 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103856.9981215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.9981215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17477, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.9981215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.375) 0:00:24.771 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.037) 0:00:24.809 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.036) 0:00:24.845 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.034) 0:00:24.879 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:24.910 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.034) 0:00:24.945 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.029) 0:00:24.974 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.005 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.035 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.041) 0:00:25.077 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.031) 0:00:25.108 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.138 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.169 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.029) 0:00:25.198 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.229 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.040) 0:00:25.269 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.039) 0:00:25.308 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.032) 0:00:25.340 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.031) 0:00:25.371 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.029) 0:00:25.401 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.030) 0:00:25.432 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.033) 0:00:25.466 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.032) 0:00:25.498 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.032) 0:00:25.530 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.032) 0:00:25.562 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:17:48 +0000 (0:00:00.032) 0:00:25.595 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.032) 0:00:25.627 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.036) 0:00:25.664 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.499) 0:00:26.163 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.372) 0:00:26.536 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "3221225472"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.038) 0:00:26.574 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:17:49 +0000 (0:00:00.033) 0:00:26.608 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.031) 0:00:26.640 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.031) 0:00:26.671 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.034) 0:00:26.706 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.068) 0:00:26.774 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.032) 0:00:26.806 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.035) 0:00:26.842 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.032) 0:00:26.874 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.036) 0:00:26.911 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.036851", "end": "2022-06-01 13:17:50.039670", "rc": 0, "start": "2022-06-01 13:17:50.002819"}
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.409) 0:00:27.320 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.040) 0:00:27.360 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.040) 0:00:27.401 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.032) 0:00:27.433 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.032) 0:00:27.465 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size]
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.031) 0:00:27.497 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.033) 0:00:27.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.032) 0:00:27.563 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:17:50 +0000 (0:00:00.039) 0:00:27.602 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.123) 0:00:27.725 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.036) 0:00:27.762 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.047) 0:00:27.810 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.038) 0:00:27.848 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.038) 0:00:27.886 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.037) 0:00:27.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.031) 0:00:27.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.030) 0:00:27.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.030) 0:00:28.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } 
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.042) 0:00:28.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.062) 0:00:28.122 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.036) 0:00:28.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.041) 0:00:28.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.031) 0:00:28.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": 
null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.032) 0:00:28.263 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.039) 0:00:28.303 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:17:51 +0000 (0:00:00.042) 0:00:28.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.7701216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.7701216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17444, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.7701216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.369) 0:00:28.716 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.039) 0:00:28.755 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.036) 0:00:28.792 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.034) 0:00:28.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.030) 0:00:28.857 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.036) 0:00:28.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.033) 0:00:28.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the 
presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.030) 0:00:28.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.030) 0:00:28.988 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.037) 0:00:29.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.031) 0:00:29.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.032) 0:00:29.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.035) 0:00:29.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.037) 0:00:29.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.033) 0:00:29.196 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.039) 0:00:29.236 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.038) 0:00:29.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.034) 0:00:29.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.042) 0:00:29.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.081) 0:00:29.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.033) 0:00:29.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.037) 0:00:29.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.035) 0:00:29.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.030) 0:00:29.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:17:52 +0000 (0:00:00.031) 0:00:29.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.034) 0:00:29.636 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.031) 0:00:29.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.032) 0:00:29.700 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.374) 0:00:30.075 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.369) 0:00:30.444 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.038) 0:00:30.482 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } 
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.035) 0:00:30.517 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.033) 0:00:30.550 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.032) 0:00:30.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:17:53 +0000 (0:00:00.033) 0:00:30.617 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.031) 0:00:30.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.031) 0:00:30.680 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.033) 
0:00:30.713 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.032) 0:00:30.745 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.036) 0:00:30.782 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.027557", "end": "2022-06-01 13:17:53.908599", "rc": 0, "start": "2022-06-01 13:17:53.881042" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.406) 0:00:31.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.040) 0:00:31.229 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.039) 0:00:31.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.033) 0:00:31.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.032) 0:00:31.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.032) 0:00:31.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.032) 0:00:31.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.033) 0:00:31.435 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.034) 0:00:31.469 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml 
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.115) 0:00:31.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:17:54 +0000 (0:00:00.033) 0:00:31.619 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "1b2626e4-5f79-4fc7-8f10-b4491ed5902e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 
3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "1b2626e4-5f79-4fc7-8f10-b4491ed5902e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.041) 0:00:31.660 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.038) 0:00:31.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.033) 0:00:31.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.035) 0:00:31.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.031) 0:00:31.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:17:55 +0000 
(0:00:00.029) 0:00:31.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.031) 0:00:31.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.038) 0:00:31.898 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.048) 0:00:31.947 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.035) 0:00:31.982 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.036) 0:00:32.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.031) 0:00:32.050 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.031) 0:00:32.081 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.076) 0:00:32.157 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.036) 0:00:32.194 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.5431216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.5431216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17410, "isblk": true, "ischr": false, "isdir": 
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.5431216, "nlink": 1, "path": "/dev/mapper/foo-test3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.387) 0:00:32.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:17:55 +0000 (0:00:00.036) 0:00:32.618 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.036) 0:00:32.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.036) 0:00:32.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.034) 0:00:32.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat 
the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.041) 0:00:32.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.032) 0:00:32.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.031) 0:00:32.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.031) 0:00:32.863 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.037) 0:00:32.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.031) 0:00:32.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.033) 0:00:32.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.032) 0:00:32.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.030) 0:00:33.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.031) 0:00:33.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.038) 0:00:33.099 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.036) 0:00:33.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.034) 0:00:33.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.030) 0:00:33.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.033) 0:00:33.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.033) 0:00:33.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.035) 0:00:33.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.032) 0:00:33.335 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.034) 0:00:33.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.033) 0:00:33.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.033) 0:00:33.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.030) 0:00:33.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:17:56 +0000 (0:00:00.029) 0:00:33.496 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.382) 0:00:33.879 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": 
"3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.377) 0:00:34.256 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.038) 0:00:34.295 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.037) 0:00:34.332 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.031) 0:00:34.363 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.030) 0:00:34.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.032) 0:00:34.426 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.032) 0:00:34.458 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.035) 0:00:34.493 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.043) 0:00:34.537 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.034) 0:00:34.572 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:17:57 +0000 (0:00:00.040) 0:00:34.612 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test3" ], "delta": "0:00:00.035813", "end": "2022-06-01 13:17:57.757943", "rc": 0, "start": "2022-06-01 13:17:57.722130" } STDOUT: LVM2_LV_NAME=test3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.429) 0:00:35.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.039) 0:00:35.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.040) 0:00:35.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.033) 0:00:35.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.043) 0:00:35.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.035) 0:00:35.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.035) 0:00:35.271 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable 
namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.032) 0:00:35.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.032) 0:00:35.336 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.028) 0:00:35.365 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:46 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.082) 0:00:35.447 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.061) 0:00:35.508 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:17:58 +0000 (0:00:00.045) 0:00:35.554 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.526) 0:00:36.081 ******** 
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.072) 0:00:36.153 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.035) 0:00:36.188 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.033) 0:00:36.222 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.065) 0:00:36.287 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.026) 0:00:36.314 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.031) 0:00:36.345 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.036) 0:00:36.382 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.035) 0:00:36.417 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.030) 0:00:36.448 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.031) 0:00:36.479 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.031) 0:00:36.510 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.031) 0:00:36.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.046) 0:00:36.589 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:17:59 +0000 (0:00:00.032) 0:00:36.621 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [
"/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", 
"vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some 
platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:18:01 +0000 (0:00:01.645) 0:00:38.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.032) 0:00:38.298 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.029) 0:00:38.328 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": 
"/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.046) 0:00:38.375 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.043) 0:00:38.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.037) 0:00:38.456 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:18:01 +0000 (0:00:00.029) 0:00:38.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:18:02 +0000 (0:00:00.673) 0:00:39.159 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:18:03 +0000 (0:00:01.106) 0:00:40.265 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:18:04 +0000 (0:00:00.670) 0:00:40.935 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": 
"da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:18:04 +0000 (0:00:00.373) 0:00:41.309 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:18:04 +0000 (0:00:00.031) 0:00:41.341 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:64 Wednesday 01 June 2022 17:18:05 +0000 (0:00:00.889) 0:00:42.230 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:18:05 +0000 (0:00:00.056) 0:00:42.287 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": 
"foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:18:05 +0000 (0:00:00.048) 0:00:42.335 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:18:05 +0000 (0:00:00.031) 0:00:42.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "3G", "type": "lvm", "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" }, "/dev/mapper/foo-test3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test3", "size": "3G", "type": "lvm", "uuid": "1b2626e4-5f79-4fc7-8f10-b4491ed5902e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "03U2cv-5u0P-6DD5-Dxdc-gOPQ-C4oo-93yqJC" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": 
"disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:18:06 +0000 (0:00:00.375) 0:00:42.741 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004988", "end": "2022-06-01 13:18:05.835111", "rc": 0, "start": "2022-06-01 13:18:05.830123" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0 /dev/mapper/foo-test3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:18:06 +0000 (0:00:00.378) 0:00:43.120 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002782", "end": "2022-06-01 13:18:06.217124", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:18:06.214342" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:18:06 +0000 (0:00:00.380) 0:00:43.501 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
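[Editor's note] The fstab entries and pool/volume values verified above are consistent with a `linux-system-roles.storage` invocation along the following lines. This is a hedged reconstruction from the logged values (pool `foo` on disk `sda`, three 3g xfs volumes mounted under `/opt/test1`..`/opt/test3`), not the actual `tests_lvm_one_disk_multiple_volumes.yml` playbook:

```yaml
# Hypothetical sketch of the storage_pools input that would produce the
# mounts verified in this log; variable names follow the
# linux-system-roles.storage role, values are taken from the log output.
- hosts: all
  roles:
    - name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo            # VG name seen as /dev/mapper/foo-*
            disks: ['sda']       # single PV, matching "Verify PV count"
            volumes:
              - { name: test1, size: 3g, mount_point: '/opt/test1' }
              - { name: test2, size: 3g, mount_point: '/opt/test2' }
              - { name: test3, size: 3g, mount_point: '/opt/test3' }
```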
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:18:06 +0000 (0:00:00.072) 0:00:43.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:18:06 +0000 (0:00:00.032) 0:00:43.606 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.062) 0:00:43.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.039) 0:00:43.708 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.383) 0:00:44.091 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.043) 0:00:44.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.038) 0:00:44.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.038) 0:00:44.211 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.037) 0:00:44.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.033) 0:00:44.282 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.044) 0:00:44.326 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.092) 0:00:44.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.032) 0:00:44.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.037) 0:00:44.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.035) 0:00:44.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.031) 0:00:44.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:18:07 +0000 (0:00:00.031) 0:00:44.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:18:07 +0000 (0:00:00.033) 0:00:44.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:44.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:44.684 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.060) 0:00:44.744 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.089) 0:00:44.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.030) 0:00:44.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.030) 0:00:44.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.042) 0:00:44.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.035) 0:00:44.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.032) 0:00:45.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.030) 0:00:45.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:45.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:45.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.034) 0:00:45.134 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.062) 0:00:45.196 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.037) 0:00:45.234 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.035) 0:00:45.269 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.060) 0:00:45.329 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.035) 0:00:45.365 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.036) 0:00:45.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:45.434 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.031) 0:00:45.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.040) 0:00:45.506 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.034) 0:00:45.540 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:18:08 +0000 (0:00:00.030) 0:00:45.571 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.064) 0:00:45.635 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.104) 0:00:45.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:45.773 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.034) 0:00:45.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:45.840 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.030) 0:00:45.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:45.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:45.935 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:45.966 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.035) 0:00:46.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.033 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.030) 0:00:46.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.030) 0:00:46.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.126 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.157 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:46.190 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.222 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:46.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.030) 0:00:46.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.316 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.030) 0:00:46.346 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:46.378 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.081) 0:00:46.460 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.032) 0:00:46.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.031) 0:00:46.524 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.039) 0:00:46.563 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:18:09 +0000 (0:00:00.036) 0:00:46.600 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.092) 0:00:46.692 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.038) 0:00:46.731 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.121) 0:00:46.852 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.036) 0:00:46.889 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.046) 0:00:46.935 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.039) 0:00:46.975 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.037) 0:00:47.012 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.038) 0:00:47.051 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.030) 0:00:47.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.034) 0:00:47.116 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.030) 0:00:47.147 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.032) 0:00:47.180 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.048) 0:00:47.229 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.037) 0:00:47.266 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.038) 0:00:47.305 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.033) 0:00:47.339 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.032) 0:00:47.371 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.038) 0:00:47.409 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:18:10 +0000 (0:00:00.037) 0:00:47.447 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.9981215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.9981215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17477, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.9981215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.381) 0:00:47.829 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.039) 0:00:47.868 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.036) 0:00:47.904 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.035) 0:00:47.939 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:47.970 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.036) 0:00:48.007 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.030) 0:00:48.038 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.030) 0:00:48.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:48.101 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.042) 0:00:48.143 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:48.175 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.032) 0:00:48.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.032) 0:00:48.240 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:48.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.032) 0:00:48.303 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.041) 0:00:48.345 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.037) 0:00:48.383 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:48.414 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.030) 0:00:48.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.030) 0:00:48.475 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.031) 0:00:48.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.032) 0:00:48.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:18:11 +0000 (0:00:00.040) 0:00:48.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.043) 0:00:48.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.044) 0:00:48.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.037) 0:00:48.707 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.032) 0:00:48.740 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.101) 0:00:48.841 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:18:12 +0000 (0:00:00.404) 0:00:49.246 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.412) 0:00:49.658 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.039) 0:00:49.698 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.036) 0:00:49.734 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.033) 0:00:49.768 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.033) 0:00:49.801 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.029) 0:00:49.830 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.031) 0:00:49.862 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.028) 0:00:49.890 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.032) 0:00:49.922 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.031) 0:00:49.954 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.037) 0:00:49.991 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.045592", "end": "2022-06-01 13:18:13.135721", "rc": 0, "start": "2022-06-01 13:18:13.090129" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.428) 0:00:50.420 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.038) 0:00:50.459 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.037) 0:00:50.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.029) 0:00:50.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.034) 0:00:50.560 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:18:13 +0000 (0:00:00.033) 0:00:50.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.033) 0:00:50.627 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.032) 0:00:50.659 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.037) 0:00:50.697 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.118) 0:00:50.815 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.036) 0:00:50.852 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e34da03d-d928-426f-b106-3fccd9f285b5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device]
******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.043) 0:00:50.895 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.040) 0:00:50.936 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.038) 0:00:50.974 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.039) 0:00:51.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.030) 0:00:51.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.030) 0:00:51.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:14 +0000 
(0:00:00.029) 0:00:51.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.032) 0:00:51.137 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.049) 0:00:51.187 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.036) 0:00:51.223 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.037) 0:00:51.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.033) 0:00:51.295 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.032) 0:00:51.327 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.037) 0:00:51.365 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:18:14 +0000 (0:00:00.039) 0:00:51.405 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.7701216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.7701216, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17444, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.7701216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, 
"version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.389) 0:00:51.794 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.075) 0:00:51.870 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.037) 0:00:51.907 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.035) 0:00:51.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:51.973 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.036) 0:00:52.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.032) 0:00:52.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.033) 0:00:52.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:52.106 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.037) 0:00:52.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:52.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:52.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.032) 0:00:52.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.033) 0:00:52.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.033) 0:00:52.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.039) 0:00:52.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.035) 0:00:52.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:52.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.030) 0:00:52.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.034) 0:00:52.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.032) 0:00:52.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.031) 0:00:52.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.032) 0:00:52.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:18:15 +0000 (0:00:00.032) 0:00:52.604 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.035) 0:00:52.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.036) 0:00:52.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.032) 0:00:52.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.033) 0:00:52.742 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.364) 0:00:53.106 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.389) 0:00:53.496 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": 
"3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.039) 0:00:53.535 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.034) 0:00:53.569 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:18:16 +0000 (0:00:00.032) 0:00:53.602 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.033) 0:00:53.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.031) 0:00:53.666 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.035) 0:00:53.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.035) 0:00:53.738 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.038) 0:00:53.776 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.034) 0:00:53.811 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.042) 0:00:53.853 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.034518", "end": "2022-06-01 13:18:16.985630", "rc": 0, "start": "2022-06-01 13:18:16.951112" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.417) 0:00:54.270 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.041) 0:00:54.312 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.039) 0:00:54.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.034) 0:00:54.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.034) 0:00:54.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.032) 0:00:54.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.041) 0:00:54.495 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.035) 0:00:54.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", 
"encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:18:17 +0000 (0:00:00.037) 0:00:54.568 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.174) 0:00:54.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.038) 0:00:54.781 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": 
"1b2626e4-5f79-4fc7-8f10-b4491ed5902e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "1b2626e4-5f79-4fc7-8f10-b4491ed5902e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.043) 0:00:54.825 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.040) 0:00:54.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.036) 0:00:54.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.038) 0:00:54.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 
2022 17:18:18 +0000 (0:00:00.029) 0:00:54.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.030) 0:00:55.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.030) 0:00:55.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.035) 0:00:55.066 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.048) 0:00:55.114 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.035) 0:00:55.149 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.036) 0:00:55.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.030) 0:00:55.217 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.032) 0:00:55.249 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.040) 0:00:55.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:18:18 +0000 (0:00:00.037) 0:00:55.328 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.5431216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.5431216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.5431216, "nlink": 1, "path": "/dev/mapper/foo-test3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.388) 0:00:55.716 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.038) 0:00:55.754 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.037) 0:00:55.791 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.033) 0:00:55.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:55.855 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.036) 0:00:55.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.029) 0:00:55.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:55.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.029) 0:00:55.981 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.036) 0:00:56.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device 
type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:56.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.032) 0:00:56.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:56.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.031) 0:00:56.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:56.173 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.037) 0:00:56.211 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.035) 0:00:56.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.033) 0:00:56.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.033) 0:00:56.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:56.345 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.033) 0:00:56.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.031) 0:00:56.410 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.030) 0:00:56.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.033) 0:00:56.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.031) 0:00:56.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.032) 0:00:56.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.031) 0:00:56.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:18:19 +0000 (0:00:00.031) 0:00:56.602 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 
GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.376) 0:00:56.979 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.376) 0:00:57.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.039) 0:00:57.395 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.034) 0:00:57.430 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.071) 0:00:57.502 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.032) 0:00:57.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:18:20 +0000 
(0:00:00.032) 0:00:57.567 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:18:20 +0000 (0:00:00.032) 0:00:57.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.031) 0:00:57.630 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.035) 0:00:57.666 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.037) 0:00:57.703 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.040) 0:00:57.744 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test3" ], "delta": "0:00:00.037850", "end": "2022-06-01 13:18:20.870903", "rc": 0, "start": "2022-06-01 13:18:20.833053" } STDOUT: LVM2_LV_NAME=test3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= 
LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.409) 0:00:58.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.040) 0:00:58.194 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.041) 0:00:58.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.033) 0:00:58.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.035) 0:00:58.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.036) 0:00:58.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.034) 0:00:58.375 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.031) 0:00:58.407 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.031) 0:00:58.438 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.028) 0:00:58.466 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove two of the LVs] *************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:66 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.031) 0:00:58.498 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:18:21 +0000 (0:00:00.077) 0:00:58.575 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 
2022 17:18:21 +0000 (0:00:00.047) 0:00:58.623 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.505) 0:00:59.128 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.076) 0:00:59.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.039) 0:00:59.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.035) 0:00:59.280 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.072) 0:00:59.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.028) 0:00:59.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.033) 0:00:59.414 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g", "state": "absent" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.039) 0:00:59.453 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.034) 0:00:59.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.041) 0:00:59.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.030) 0:00:59.560 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:18:22 +0000 (0:00:00.033) 0:00:59.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:18:23 +0000 (0:00:00.033) 0:00:59.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:18:23 +0000 (0:00:00.048) 0:00:59.675 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:18:23 +0000 (0:00:00.032) 0:00:59.708 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:18:25 +0000 (0:00:02.253) 0:01:01.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:18:25 +0000 (0:00:00.032) 0:01:01.994 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:18:25 +0000 (0:00:00.031) 0:01:02.025 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": 
"/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:18:25 +0000 (0:00:00.050) 0:01:02.076 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": 
"/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:18:25 +0000 (0:00:00.053) 0:01:02.130 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:18:25 +0000 (0:00:00.038) 0:01:02.168 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => 
{ "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:18:26 +0000 (0:00:00.722) 0:01:02.891 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:18:26 +0000 (0:00:00.661) 0:01:03.552 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:18:27 +0000 (0:00:00.379) 0:01:03.932 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:18:27 +0000 (0:00:00.660) 0:01:04.593 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:18:28 +0000 (0:00:00.367) 0:01:04.960 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:18:28 +0000 (0:00:00.032) 0:01:04.992 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:86 Wednesday 01 June 2022 17:18:29 +0000 (0:00:00.853) 0:01:05.846 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:18:29 +0000 (0:00:00.061) 0:01:05.907 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:18:29 +0000 (0:00:00.042) 0:01:05.950 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:18:29 +0000 (0:00:00.028) 0:01:05.979 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "03U2cv-5u0P-6DD5-Dxdc-gOPQ-C4oo-93yqJC" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:18:29 +0000 (0:00:00.379) 0:01:06.358 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002752", "end": "2022-06-01 13:18:29.452483", "rc": 0, "start": "2022-06-01 13:18:29.449731" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.375) 0:01:06.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002667", "end": "2022-06-01 13:18:29.834614", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:18:29.831947" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.380) 0:01:07.114 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.072) 0:01:07.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.030) 0:01:07.218 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.062) 0:01:07.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:18:30 +0000 (0:00:00.039) 0:01:07.320 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.384) 0:01:07.704 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.042) 0:01:07.746 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.037) 0:01:07.784 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.036) 0:01:07.820 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.033) 0:01:07.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:07.885 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.044) 0:01:07.929 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.059) 0:01:07.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.032) 0:01:08.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:18:31 +0000 (0:00:00.034) 0:01:08.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.212 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:08.242 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.059) 0:01:08.301 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.087) 0:01:08.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:08.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:08.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.031) 0:01:08.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:08.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.030) 0:01:08.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:31 +0000 (0:00:00.032) 0:01:08.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.031) 0:01:08.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.029) 0:01:08.668 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.063) 0:01:08.731 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.035) 0:01:08.767 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.034) 0:01:08.801 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.057) 0:01:08.858 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.072) 0:01:08.931 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.037) 0:01:08.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.029) 0:01:08.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.028) 0:01:09.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.029) 0:01:09.056 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.030) 0:01:09.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.033) 0:01:09.119 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.069) 0:01:09.189 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.113) 0:01:09.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.035) 0:01:09.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.032) 0:01:09.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 
June 2022 17:18:32 +0000 (0:00:00.034) 0:01:09.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.033) 0:01:09.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.032) 0:01:09.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.034) 0:01:09.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.035) 0:01:09.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.031) 0:01:09.572 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:32 +0000 (0:00:00.031) 0:01:09.603 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.033) 0:01:09.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:09.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:09.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:09.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:09.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:09.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null 
}, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.032) 0:01:09.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:09.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:09.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:09.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:09.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:09.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:10.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:10.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:10.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:10.104 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.081) 0:01:10.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.035) 0:01:10.221 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.126) 0:01:10.347 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.036) 0:01:10.384 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.040) 0:01:10.425 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.036) 0:01:10.462 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:33 +0000 
(0:00:00.033) 0:01:10.495 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.038) 0:01:10.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.030) 0:01:10.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:33 +0000 (0:00:00.031) 0:01:10.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.030) 0:01:10.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.034) 0:01:10.662 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.045) 0:01:10.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.034) 0:01:10.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.035) 0:01:10.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.029) 0:01:10.808 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.030) 0:01:10.838 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.087) 0:01:10.926 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.039) 0:01:10.965 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103856.9981215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103856.9981215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17477, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103856.9981215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.385) 0:01:11.351 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.038) 0:01:11.389 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.035) 0:01:11.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.032) 0:01:11.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.028) 0:01:11.487 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.035) 0:01:11.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.029) 0:01:11.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.030) 0:01:11.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:18:34 +0000 (0:00:00.030) 0:01:11.613 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.036) 0:01:11.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.029) 0:01:11.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:11.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.028) 0:01:11.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.037) 0:01:11.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.035) 0:01:11.812 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.040) 0:01:11.853 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.040) 0:01:11.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.034) 0:01:11.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.029) 0:01:11.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:11.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.033) 0:01:12.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.032) 0:01:12.056 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:12.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.034) 0:01:12.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:12.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:12.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.032) 0:01:12.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.031) 0:01:12.248 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:18:35 +0000 (0:00:00.370) 0:01:12.618 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.378) 0:01:12.996 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.035) 0:01:13.032 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.030) 0:01:13.063 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.035) 0:01:13.098 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.034) 0:01:13.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.031) 0:01:13.165 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.040) 0:01:13.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.030) 0:01:13.236 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.040) 0:01:13.276 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.035) 0:01:13.311 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:18:36 +0000 (0:00:00.041) 0:01:13.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.040402", "end": "2022-06-01 13:18:36.494790", "rc": 0, "start": "2022-06-01 13:18:36.454388" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.425) 0:01:13.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.042) 0:01:13.821 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.038) 0:01:13.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.032) 0:01:13.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.032) 0:01:13.925 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.084) 0:01:14.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.034) 0:01:14.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.031) 0:01:14.076 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.035) 0:01:14.111 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml 
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.123) 0:01:14.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.037) 0:01:14.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.044) 0:01:14.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.031) 0:01:14.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.035) 0:01:14.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.031) 0:01:14.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.031) 0:01:14.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.030) 0:01:14.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.035) 0:01:14.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.035) 0:01:14.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.049) 0:01:14.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:37 +0000 (0:00:00.027) 0:01:14.623 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.035) 0:01:14.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.030) 0:01:14.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.033) 0:01:14.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.033) 0:01:14.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.027) 0:01:14.783 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.368) 0:01:15.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.039) 0:01:15.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.025) 0:01:15.217 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.033) 0:01:15.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.030) 0:01:15.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.028) 0:01:15.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.030) 0:01:15.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.034) 0:01:15.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.032) 0:01:15.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.026) 0:01:15.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.032) 0:01:15.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.034) 0:01:15.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.032) 0:01:15.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.031) 0:01:15.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:18:38 +0000 (0:00:00.032) 0:01:15.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.040) 0:01:15.636 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab 
entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.036) 0:01:15.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.033) 0:01:15.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:15.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:15.767 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.031) 0:01:15.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.031) 0:01:15.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.031) 0:01:15.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.033) 0:01:15.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.032) 0:01:15.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:15.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:15.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.029) 0:01:16.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.032) 0:01:16.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.113 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.033) 0:01:16.147 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.029) 0:01:16.177 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.031) 0:01:16.239 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.087) 0:01:16.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.033) 0:01:16.360 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.034) 0:01:16.395 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.033) 0:01:16.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.031) 0:01:16.521 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:18:39 +0000 (0:00:00.030) 0:01:16.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.030) 0:01:16.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.029) 0:01:16.673 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.032) 0:01:16.705 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.036) 0:01:16.741 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.117) 0:01:16.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.034) 0:01:16.894 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], 
"storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.038) 0:01:16.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.029) 0:01:16.963 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.035) 0:01:16.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.028) 0:01:17.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.030) 0:01:17.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.029) 0:01:17.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.030) 0:01:17.118 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.036) 0:01:17.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.054) 0:01:17.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.025) 0:01:17.235 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.036) 0:01:17.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.029) 0:01:17.301 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.030) 0:01:17.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.029) 0:01:17.361 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:18:40 +0000 (0:00:00.027) 0:01:17.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.364) 0:01:17.753 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.038) 0:01:17.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.027) 0:01:17.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.034) 0:01:17.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.031) 0:01:17.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.031) 0:01:17.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.032) 0:01:17.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.033) 0:01:17.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.031) 0:01:18.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.026) 0:01:18.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.030) 0:01:18.072 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.030) 0:01:18.103 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.030) 0:01:18.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.034) 0:01:18.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.031) 0:01:18.200 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.038) 0:01:18.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.035) 0:01:18.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.030) 0:01:18.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.030) 0:01:18.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.032) 0:01:18.367 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.029) 0:01:18.397 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.029) 0:01:18.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.028) 0:01:18.455 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.027) 0:01:18.483 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.027) 0:01:18.510 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:18:41 +0000 (0:00:00.094) 0:01:18.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:18.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.032) 0:01:18.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.032) 0:01:18.704 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:18.736 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:18.767 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.036) 0:01:18.804 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:18.836 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:18.867 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:18.899 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.030) 0:01:18.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.035) 0:01:18.965 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.039) 0:01:19.004 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.038 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.030) 0:01:19.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:19.100 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:19.132 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.034) 0:01:19.166 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.037) 0:01:19.204 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.237 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.031) 0:01:19.269 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.303 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.336 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.032) 0:01:19.369 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.403 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Re-run the previous role invocation to ensure idempotence] ***************
task path:
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:88
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.033) 0:01:19.436 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.086) 0:01:19.523 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:18:42 +0000 (0:00:00.047) 0:01:19.571 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.531) 0:01:20.103 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.072) 0:01:20.176 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.030) 0:01:20.206 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.032) 0:01:20.238 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.061) 0:01:20.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.025) 0:01:20.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.031) 0:01:20.357 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g", "state": "absent" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g", "state": "absent" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.037) 0:01:20.394 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.037) 0:01:20.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.033) 0:01:20.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.033) 0:01:20.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.033) 0:01:20.532 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.034) 0:01:20.567 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:18:43 +0000 (0:00:00.047) 0:01:20.614 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:18:44 +0000 (0:00:00.030) 0:01:20.645 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:18:45 +0000 (0:00:01.316) 0:01:21.961 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.032) 0:01:21.994 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.030) 0:01:22.024 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count":
null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.045) 0:01:22.069 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test
verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.044) 0:01:22.113 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.034) 0:01:22.148 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:18:45 +0000 (0:00:00.030) 0:01:22.178 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:18:46 +0000 (0:00:00.650) 0:01:22.829 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:18:46 +0000 (0:00:00.388) 0:01:23.217 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:18:47 +0000 (0:00:00.671) 0:01:23.889 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:18:47 +0000 (0:00:00.378) 0:01:24.268 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:18:47 +0000 (0:00:00.028) 0:01:24.296 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:108
Wednesday 01 June 2022 17:18:48 +0000 (0:00:00.871) 0:01:25.168 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information]
**********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:18:48 +0000 (0:00:00.063) 0:01:25.231 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:18:48 +0000 (0:00:00.043) 0:01:25.275 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:18:48 +0000 (0:00:00.028) 0:01:25.303 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "03U2cv-5u0P-6DD5-Dxdc-gOPQ-C4oo-93yqJC" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:18:49 +0000 (0:00:00.395) 0:01:25.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002607", "end": "2022-06-01 13:18:48.776346", "rc": 0, "start": "2022-06-01 13:18:48.773739" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:18:49 +0000 (0:00:00.357) 0:01:26.056 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002525", "end": "2022-06-01 13:18:49.153506", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:18:49.150981" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:18:49 +0000 (0:00:00.377) 0:01:26.434 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:18:49 +0000 (0:00:00.117) 0:01:26.551 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:18:49 +0000 (0:00:00.030) 0:01:26.582 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.064) 0:01:26.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.042) 0:01:26.690 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.366) 0:01:27.056 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.042) 0:01:27.099 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.042) 0:01:27.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.040) 0:01:27.182 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.038) 0:01:27.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.029) 0:01:27.250 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.045) 0:01:27.296 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.057) 0:01:27.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.030) 0:01:27.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.031) 0:01:27.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.033) 0:01:27.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.030) 0:01:27.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.030) 0:01:27.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:18:50 +0000 (0:00:00.030) 0:01:27.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.030) 0:01:27.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:18:50 +0000 (0:00:00.031) 0:01:27.604 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.062) 0:01:27.666 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.084) 0:01:27.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.032) 0:01:27.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.029) 0:01:27.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:27.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:27.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:27.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.029) 0:01:27.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.031) 0:01:27.966 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:27.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:28.026 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.060) 0:01:28.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.039) 0:01:28.127 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.035) 0:01:28.162 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.057) 0:01:28.220 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.036) 0:01:28.256 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.036) 0:01:28.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.032) 0:01:28.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.029) 0:01:28.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:28.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:28.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.030) 0:01:28.446 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:18:51 +0000 (0:00:00.063) 0:01:28.510 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.152) 0:01:28.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.033) 0:01:28.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.033) 0:01:28.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 
June 2022 17:18:52 +0000 (0:00:00.030) 0:01:28.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:28.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.029) 0:01:28.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.032) 0:01:28.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:28.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.031) 0:01:28.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:28.945 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:28.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:29.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.033) 0:01:29.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:29.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:29.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:29.132 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null 
}, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.036) 0:01:29.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.034) 0:01:29.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.034) 0:01:29.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.032) 0:01:29.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.032) 0:01:29.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.030) 0:01:29.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.029) 0:01:29.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.029) 0:01:29.392 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.034) 0:01:29.427 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.031) 0:01:29.459 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.089) 0:01:29.548 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:18:52 +0000 (0:00:00.038) 0:01:29.586 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.119) 0:01:29.706 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.035) 0:01:29.742 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "ab72eadd-b75b-44d0-82ef-69dac335e18e"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.041) 0:01:29.783 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.037) 0:01:29.821 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.035) 0:01:29.856 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.039) 0:01:29.896 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.029) 0:01:29.926 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.029) 0:01:29.955 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.029) 0:01:29.984 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.030) 0:01:30.015 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.046) 0:01:30.061 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.036) 0:01:30.097 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.035) 0:01:30.133 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.029) 0:01:30.162 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.030) 0:01:30.193 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.038) 0:01:30.231 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:18:53 +0000 (0:00:00.038) 0:01:30.270 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654103856.9981215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654103856.9981215,
        "dev": 5,
        "device_type": 64770,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 17477,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654103856.9981215,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.379) 0:01:30.649 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.038) 0:01:30.688 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.036) 0:01:30.724 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.033) 0:01:30.758 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:30.788 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.034) 0:01:30.823 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.029) 0:01:30.853 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:30.884 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.071) 0:01:30.956 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.038) 0:01:30.994 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:31.025 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:31.055 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.029) 0:01:31.085 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:31.115 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.032) 0:01:31.147 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.039) 0:01:31.187 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.042) 0:01:31.229 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.032) 0:01:31.261 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.031) 0:01:31.293 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:31.324 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.035) 0:01:31.359 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.031) 0:01:31.390 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.031) 0:01:31.421 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.029) 0:01:31.451 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.029) 0:01:31.481 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.031) 0:01:31.512 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.032) 0:01:31.544 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:18:54 +0000 (0:00:00.030) 0:01:31.575 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.390) 0:01:31.966 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.369) 0:01:32.335 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.039) 0:01:32.375 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.032) 0:01:32.407 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.029) 0:01:32.437 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.029) 0:01:32.467 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.030) 0:01:32.498 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.034) 0:01:32.533 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.031) 0:01:32.564 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:18:55 +0000 (0:00:00.034) 0:01:32.599 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.033) 0:01:32.633 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.039) 0:01:32.672 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.037377",
    "end": "2022-06-01 13:18:55.804269",
    "rc": 0,
    "start": "2022-06-01 13:18:55.766892"
}

STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.415) 0:01:33.088 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.039) 0:01:33.127 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.041) 0:01:33.169 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.031) 0:01:33.200 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.031) 0:01:33.231 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.034) 0:01:33.265 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.034) 0:01:33.300 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.032) 0:01:33.332 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.038) 0:01:33.371 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.122) 0:01:33.494 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": ""
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.036) 0:01:33.530 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.041) 0:01:33.572 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:18:56 +0000 (0:00:00.030) 0:01:33.602 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.036) 0:01:33.639 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.079) 0:01:33.718 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.031) 0:01:33.750 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.030) 0:01:33.781 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.030) 0:01:33.811 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.031) 0:01:33.843 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.046) 0:01:33.890 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.029) 0:01:33.919 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.037) 0:01:33.956 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.030) 0:01:33.987 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.031) 0:01:34.018 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.030) 0:01:34.049 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.026) 0:01:34.075 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.370) 0:01:34.445 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.040) 0:01:34.485 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.030) 0:01:34.516 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.037) 0:01:34.554 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.031) 0:01:34.586 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:18:57 +0000 (0:00:00.027) 0:01:34.613 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:34.644 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:34.675 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.033) 0:01:34.708 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.027) 0:01:34.735 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.031) 0:01:34.766 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.028) 0:01:34.794 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.028) 0:01:34.823 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:34.852 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.031) 0:01:34.883 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.040) 0:01:34.924 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.035) 0:01:34.959 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:34.989 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:35.019 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:35.048 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.033) 0:01:35.082 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.112 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:35.142 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.172 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:35.201 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.031) 0:01:35.233 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.032) 0:01:35.265 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.296 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume]
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.031) 0:01:35.388 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.032) 0:01:35.421 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.032) 0:01:35.454 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.029) 0:01:35.514 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.030) 0:01:35.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.031) 0:01:35.575 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:18:58 +0000 (0:00:00.035) 0:01:35.611 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.036) 0:01:35.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.029) 0:01:35.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.028) 0:01:35.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.030) 0:01:35.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.029) 0:01:35.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.029) 0:01:35.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.033) 0:01:35.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.030) 0:01:35.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.029) 0:01:35.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.031) 0:01:35.921 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.035) 0:01:35.957 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.152) 0:01:36.110 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.036) 0:01:36.147 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": 
"0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.039) 0:01:36.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.029) 0:01:36.216 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.039) 0:01:36.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.033) 0:01:36.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.032) 0:01:36.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.033) 0:01:36.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.031) 0:01:36.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.032) 0:01:36.419 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.048) 0:01:36.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.027) 0:01:36.494 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.039) 0:01:36.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.039) 0:01:36.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:18:59 +0000 (0:00:00.034) 0:01:36.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.032) 0:01:36.641 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.027) 0:01:36.668 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.363) 0:01:37.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.038) 0:01:37.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.027) 0:01:37.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.036) 0:01:37.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.032) 0:01:37.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.027) 0:01:37.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.032) 0:01:37.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.031) 0:01:37.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.029) 0:01:37.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.035) 0:01:37.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.031) 0:01:37.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.039) 0:01:37.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.036) 0:01:37.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:19:00 +0000 (0:00:00.030) 0:01:37.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.029) 0:01:37.643 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:37.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:37.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:37.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.029) 0:01:37.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.029) 0:01:37.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:37.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:37.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:37.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:37.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.029) 0:01:37.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:37.984 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:38.017 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:38.048 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:38.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:38.114 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.032) 0:01:38.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.032) 0:01:38.178 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:38.212 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK 
[assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.033) 0:01:38.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.032) 0:01:38.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.073) 0:01:38.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.031) 0:01:38.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:38.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:38.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.030) 0:01:38.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.029) 0:01:38.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.032) 0:01:38.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.032) 0:01:38.570 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:19:01 +0000 (0:00:00.034) 0:01:38.604 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.029) 0:01:38.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:110 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.031) 0:01:38.665 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.088) 0:01:38.753 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.043) 0:01:38.797 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.560) 0:01:39.357 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : 
define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.074) 0:01:39.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.032) 0:01:39.464 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.035) 0:01:39.500 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.069) 0:01:39.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:19:02 +0000 (0:00:00.027) 0:01:39.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.031) 0:01:39.629 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.038) 0:01:39.667 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.033) 0:01:39.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.030) 0:01:39.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.030) 0:01:39.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.031) 0:01:39.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.030) 0:01:39.823 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.046) 0:01:39.870 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:19:03 +0000 (0:00:00.028) 0:01:39.899 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:19:05 +0000 (0:00:01.776) 0:01:41.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.032) 0:01:41.707 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.027) 0:01:41.735 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.036) 0:01:41.771 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.035) 0:01:41.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.037) 0:01:41.844 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:19:05 +0000 (0:00:00.395) 0:01:42.240 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:19:06 +0000 (0:00:00.650) 0:01:42.891 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:19:06 +0000 (0:00:00.030) 0:01:42.921 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:19:06 +0000 (0:00:00.666) 0:01:43.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:19:07 +0000 (0:00:00.375) 0:01:43.963 ******** TASK 
[linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:19:07 +0000 (0:00:00.027) 0:01:43.991 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=718 changed=6 unreachable=0 failed=0 skipped=687 rescued=0 ignored=0 Wednesday 01 June 2022 17:19:08 +0000 (0:00:00.814) 0:01:44.805 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.25s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.65s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.32s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : set up new/current mounts ------------------ 1.31s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 set up internal repositories -------------------------------------------- 1.28s /cache/rhel-x_setup.yml:6 
----------------------------------------------------- linux-system-roles.storage : set up new/current mounts ------------------ 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Gathering Facts --------------------------------------------------------- 1.08s /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:2 -------------- linux-system-roles.storage : Update facts ------------------------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure blivet is available -------------- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.92s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : remove obsolete mounts --------------------- 0.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 
ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:19:08 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:19:10 +0000 (0:00:01.287) 0:00:01.310 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_one_disk_multiple_volumes_nvme_generated.yml *************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* 
META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:19:10 +0000 (0:00:00.020) 0:00:01.330 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.29s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:19:10 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:19:12 +0000 (0:00:01.269) 0:00:01.292 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.27s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_lvm_one_disk_multiple_volumes_scsi_generated.yml *************** 2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_scsi_generated.yml:3 Wednesday 01 June 2022 17:19:12 +0000 (0:00:00.018) 0:00:01.310 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_scsi_generated.yml:7 Wednesday 01 June 2022 17:19:13 +0000 (0:00:01.085) 0:00:02.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:2 Wednesday 01 June 2022 17:19:13 +0000 (0:00:00.025) 0:00:02.421 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:11 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.780) 0:00:03.201 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.039) 0:00:03.241 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.159) 0:00:03.401 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:19:14 +0000 
(0:00:00.535) 0:00:03.936 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.071) 0:00:04.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.022) 0:00:04.030 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:19:14 +0000 (0:00:00.021) 0:00:04.052 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:19:15 +0000 (0:00:00.197) 0:00:04.249 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:19:15 +0000 (0:00:00.019) 0:00:04.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:19:16 +0000 (0:00:01.059) 0:00:05.328 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:19:16 +0000 (0:00:00.045) 0:00:05.374 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:19:16 +0000 (0:00:00.044) 0:00:05.418 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:19:17 +0000 (0:00:00.704) 0:00:06.122 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:19:17 +0000 (0:00:00.081) 0:00:06.203 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:19:17 +0000 (0:00:00.021) 0:00:06.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:19:17 +0000 (0:00:00.024) 0:00:06.249 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:19:17 +0000 (0:00:00.020) 0:00:06.269 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:19:18 +0000 (0:00:00.850) 0:00:07.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
"cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
"fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
"insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
"oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
"pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
"raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
"rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
"rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
"serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
"systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
"teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown",
"status": "static" },
"user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:19:19 +0000 (0:00:01.771) 0:00:08.891 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:19:19 +0000 (0:00:00.043) 0:00:08.934 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:19:19 +0000 (0:00:00.026) 0:00:08.961 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.491) 0:00:09.452 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.029) 0:00:09.482 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.026) 0:00:09.508 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.033) 0:00:09.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.031) 0:00:09.573 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.033) 0:00:09.606 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.030) 0:00:09.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.028) 0:00:09.665 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.027) 0:00:09.692 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:19:20 +0000 (0:00:00.028) 0:00:09.720 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:19:21 +0000 (0:00:00.456) 0:00:10.177 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:19:21 +0000 (0:00:00.027) 0:00:10.205 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:14
Wednesday 01 June 2022 17:19:21 +0000 (0:00:00.787) 0:00:10.993 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:21
Wednesday 01 June 2022 17:19:21 +0000 (0:00:00.030) 0:00:11.023 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:19:21 +0000 (0:00:00.043) 0:00:11.067 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.510) 0:00:11.577 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.038) 0:00:11.616 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.033) 0:00:11.650 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create three LVM logical volumes under one volume group] *****************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:26
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.035) 0:00:11.685 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.055) 0:00:11.740 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:19:22 +0000 (0:00:00.044) 0:00:11.785 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.504) 0:00:12.289 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.069) 0:00:12.358 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.029) 0:00:12.388 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.028) 0:00:12.416 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.091) 0:00:12.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.024) 0:00:12.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.028) 0:00:12.561 ********
ok: [/cache/rhel-x.qcow2] => {
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.034) 0:00:12.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.030) 0:00:12.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.030) 0:00:12.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.027) 0:00:12.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.027) 0:00:12.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.027) 0:00:12.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.041) 0:00:12.781 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:19:23 +0000 (0:00:00.026) 0:00:12.807 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, 
"fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:19:25 +0000 (0:00:02.244) 0:00:15.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.028) 0:00:15.080 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.030) 0:00:15.111 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.050) 0:00:15.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.041) 0:00:15.203 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.036) 0:00:15.239 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:19:26 +0000 (0:00:00.030) 0:00:15.269 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:19:27 +0000 (0:00:00.950) 0:00:16.220 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) 
=> { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:19:28 +0000 (0:00:01.260) 0:00:17.480 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:19:29 +0000 (0:00:00.676) 
0:00:18.157 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:19:29 +0000 (0:00:00.366) 0:00:18.523 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:19:29 +0000 (0:00:00.029) 0:00:18.553 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:44 Wednesday 01 June 2022 17:19:30 +0000 (0:00:00.849) 0:00:19.403 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:19:30 +0000 (0:00:00.054) 0:00:19.457 ******** ok: [/cache/rhel-x.qcow2] => { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:19:30 +0000 (0:00:00.085) 0:00:19.543 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:19:30 +0000 (0:00:00.033) 0:00:19.576 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "3G", "type": "lvm", "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" }, "/dev/mapper/foo-test3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test3", "size": "3G", "type": "lvm", "uuid": "868b8e29-657d-47c7-8628-756264fa208d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Bvu1GG-umF9-GYUs-qkzc-lmqk-Tz0s-1mvpd0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": 
"disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:19:30 +0000 (0:00:00.490) 0:00:20.067 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002542", "end": "2022-06-01 13:19:30.799183", "rc": 0, "start": "2022-06-01 13:19:30.796641" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
/dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0
/dev/mapper/foo-test3 /opt/test3 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:19:31 +0000 (0:00:00.460) 0:00:20.528 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003137", "end": "2022-06-01 13:19:31.201040", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:19:31.197903" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:19:31 +0000 (0:00:00.406) 0:00:20.935 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:19:31 +0000 (0:00:00.072) 0:00:21.008 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:19:31 +0000 (0:00:00.032) 0:00:21.040 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.065) 0:00:21.106 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.042) 0:00:21.148 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.533) 0:00:21.682 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.042) 0:00:21.725 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.037) 0:00:21.762 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.034) 0:00:21.797 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.038) 0:00:21.835 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.028) 0:00:21.864 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.044) 0:00:21.909 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.062) 0:00:21.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.034) 0:00:22.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.032) 0:00:22.039 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:19:32 +0000 (0:00:00.031) 0:00:22.070 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.029) 0:00:22.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.130 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.162 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.033) 0:00:22.195 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.226 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.060) 0:00:22.287 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.086) 0:00:22.373 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:22.404 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:22.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:22.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.497 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.063) 0:00:22.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.623 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.029) 0:00:22.653 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.031) 0:00:22.684 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.064) 0:00:22.748 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.036) 0:00:22.784 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.034) 0:00:22.819 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.057) 0:00:22.877 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.036) 0:00:22.913 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.035) 0:00:22.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.029) 0:00:22.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:23.008 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:23.038 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:19:33 +0000 (0:00:00.030) 0:00:23.069 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.034) 0:00:23.103 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.064) 0:00:23.168 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.097) 0:00:23.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.031) 0:00:23.297 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:23.390 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.451 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.481 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.031) 0:00:23.512 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.031) 0:00:23.544 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:23.576 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.607 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.031) 0:00:23.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.730 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.035) 0:00:23.765 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:23.798 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.829 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.035) 0:00:23.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.038) 0:00:23.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.030) 0:00:23.933 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:23.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:23.997 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:24.030 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:19:34 +0000 (0:00:00.032) 0:00:24.063 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.082) 0:00:24.145 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.038) 0:00:24.184 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.136) 0:00:24.321 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.036) 0:00:24.357 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.098) 0:00:24.456 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.039) 0:00:24.496 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.035) 0:00:24.531 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.038) 0:00:24.569 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.032) 0:00:24.602 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.029) 0:00:24.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.032) 0:00:24.663 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.034) 0:00:24.698 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.049) 0:00:24.747 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.035) 0:00:24.783 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.036) 0:00:24.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.031) 0:00:24.851 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.032) 0:00:24.883 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.039) 0:00:24.923 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:19:35 +0000 (0:00:00.039) 0:00:24.962 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103965.2331214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.2331214, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17778, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.2331214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.382) 0:00:25.344 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.039) 0:00:25.384 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.036) 0:00:25.421 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.034) 0:00:25.455 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.030) 0:00:25.486 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.035) 0:00:25.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.616 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.040) 0:00:25.657 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.688 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.029) 0:00:25.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.749 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.032) 0:00:25.781 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.030) 0:00:25.812 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.042) 0:00:25.854 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.036) 0:00:25.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.030) 0:00:25.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.031) 0:00:25.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.032) 0:00:26.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:19:36 +0000 (0:00:00.033) 0:00:26.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.032) 0:00:26.082 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.031) 0:00:26.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.032) 0:00:26.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.031) 0:00:26.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.028) 0:00:26.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.031) 0:00:26.238 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:19:37 +0000 (0:00:00.517) 0:00:26.756 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": 
"3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.385) 0:00:27.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.039) 0:00:27.181 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.035) 0:00:27.216 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.033) 0:00:27.250 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.030) 0:00:27.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.071) 0:00:27.352 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.032) 0:00:27.384 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.032) 0:00:27.417 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.034) 0:00:27.451 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.033) 0:00:27.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.039) 0:00:27.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.040004", "end": "2022-06-01 13:19:38.204788", "rc": 0, "start": "2022-06-01 13:19:38.164784" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.413) 0:00:27.938 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.040) 0:00:27.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.041) 0:00:28.020 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:19:38 +0000 (0:00:00.035) 0:00:28.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.033) 0:00:28.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.033) 0:00:28.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.033) 0:00:28.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.034) 0:00:28.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.037) 0:00:28.227 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.125) 0:00:28.353 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.037) 0:00:28.391 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, 
"block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.041) 0:00:28.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.039) 0:00:28.472 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.035) 0:00:28.508 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.037) 0:00:28.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.031) 0:00:28.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.031) 0:00:28.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.030) 0:00:28.639 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.037) 0:00:28.676 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.047) 0:00:28.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.036) 0:00:28.760 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.037) 0:00:28.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.030) 0:00:28.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.033) 0:00:28.861 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.040) 0:00:28.902 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:19:39 +0000 (0:00:00.035) 0:00:28.938 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103965.0021214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.0021214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17745, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.0021214, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.392) 0:00:29.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.038) 0:00:29.369 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 
2022 17:19:40 +0000 (0:00:00.036) 0:00:29.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.031) 0:00:29.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.030) 0:00:29.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.035) 0:00:29.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.030) 0:00:29.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.031) 0:00:29.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.041) 0:00:29.607 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.044) 0:00:29.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.032) 0:00:29.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.036) 0:00:29.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.032) 0:00:29.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.031) 0:00:29.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.031) 0:00:29.816 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.040) 0:00:29.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.035) 0:00:29.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.080) 0:00:29.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.032) 0:00:30.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.031) 0:00:30.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:19:40 +0000 (0:00:00.032) 0:00:30.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.031) 0:00:30.102 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.030) 0:00:30.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.035) 0:00:30.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.031) 0:00:30.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.031) 0:00:30.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
17:19:41 +0000 (0:00:00.032) 0:00:30.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.031) 0:00:30.295 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:19:41 +0000 (0:00:00.380) 0:00:30.676 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.398) 0:00:31.074 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.064) 0:00:31.138 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.069) 0:00:31.208 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.059) 0:00:31.268 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.059) 0:00:31.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.059) 0:00:31.387 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.053) 0:00:31.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.034) 0:00:31.475 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.037) 0:00:31.513 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.032) 0:00:31.546 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.040) 0:00:31.587 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.035554", "end": "2022-06-01 13:19:42.286798", "rc": 0, "start": "2022-06-01 13:19:42.251244" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.432) 0:00:32.019 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:19:42 +0000 (0:00:00.038) 0:00:32.057 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.046) 0:00:32.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.036) 0:00:32.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 
01 June 2022 17:19:43 +0000 (0:00:00.036) 0:00:32.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.033) 0:00:32.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.032) 0:00:32.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.032) 0:00:32.275 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.035) 0:00:32.310 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.124) 0:00:32.435 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.036) 0:00:32.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "868b8e29-657d-47c7-8628-756264fa208d" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "868b8e29-657d-47c7-8628-756264fa208d" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:19:43 +0000 
(0:00:00.043) 0:00:32.514 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.038) 0:00:32.552 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.035) 0:00:32.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.037) 0:00:32.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.033) 0:00:32.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.031) 0:00:32.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.030) 0:00:32.721 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, 
"storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.030) 0:00:32.752 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.045) 0:00:32.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.034) 0:00:32.832 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.082) 0:00:32.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.032) 0:00:32.947 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.031) 0:00:32.979 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.037) 0:00:33.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:19:43 +0000 (0:00:00.036) 0:00:33.054 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103964.7581215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103964.7581215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17711, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103964.7581215, "nlink": 1, "path": "/dev/mapper/foo-test3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } 
} TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.391) 0:00:33.446 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.038) 0:00:33.484 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.035) 0:00:33.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.033) 0:00:33.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.030) 0:00:33.584 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.034) 0:00:33.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.029) 0:00:33.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.032) 0:00:33.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.034) 0:00:33.716 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.038) 0:00:33.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.030) 0:00:33.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.030) 0:00:33.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.030) 0:00:33.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.035) 0:00:33.881 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.039) 0:00:33.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.039) 0:00:33.960 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.036) 0:00:33.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.032) 0:00:34.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] 
**************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:19:44 +0000 (0:00:00.030) 0:00:34.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.031) 0:00:34.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.034) 0:00:34.125 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.031) 0:00:34.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.032) 0:00:34.189 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.030) 0:00:34.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.031) 0:00:34.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.032) 0:00:34.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.035) 0:00:34.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.036) 0:00:34.356 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:19:45 +0000 (0:00:00.376) 0:00:34.732 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.366) 0:00:35.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.039) 0:00:35.137 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.035) 0:00:35.173 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.033) 0:00:35.206 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.032) 0:00:35.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.032) 0:00:35.271 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.033) 0:00:35.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.032) 0:00:35.337 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, 
"changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.035) 0:00:35.372 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.034) 0:00:35.406 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.040) 0:00:35.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test3" ], "delta": "0:00:00.038484", "end": "2022-06-01 13:19:46.160724", "rc": 0, "start": "2022-06-01 13:19:46.122240" } STDOUT: LVM2_LV_NAME=test3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.447) 0:00:35.894 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.041) 0:00:35.936 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.040) 0:00:35.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.033) 0:00:36.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:19:46 +0000 (0:00:00.034) 0:00:36.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.035) 0:00:36.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.033) 0:00:36.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.036) 0:00:36.149 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.082) 0:00:36.232 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.029) 0:00:36.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:46 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.031) 0:00:36.293 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.064) 0:00:36.357 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.049) 0:00:36.407 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.551) 0:00:36.959 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => 
(item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:19:47 +0000 (0:00:00.083) 0:00:37.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.031) 0:00:37.074 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.032) 0:00:37.106 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.066) 0:00:37.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.029) 0:00:37.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.030) 0:00:37.232 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.036) 0:00:37.268 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.033) 0:00:37.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.033) 0:00:37.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure 
required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.032) 0:00:37.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.034) 0:00:37.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.031) 0:00:37.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.046) 0:00:37.480 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:19:48 +0000 (0:00:00.030) 0:00:37.510 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": 
"defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:19:50 +0000 (0:00:01.621) 0:00:39.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.032) 0:00:39.164 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.031) 0:00:39.196 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/mapper/foo-test3", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.047) 0:00:39.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, 
"changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.043) 0:00:39.287 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.033) 0:00:39.321 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.031) 0:00:39.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:19:50 +0000 (0:00:00.674) 0:00:40.028 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": 
"/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:19:52 +0000 (0:00:01.080) 0:00:41.108 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:19:52 +0000 (0:00:00.662) 0:00:41.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 
1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:19:53 +0000 (0:00:00.382) 0:00:42.153 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:19:53 +0000 (0:00:00.036) 0:00:42.189 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:64 Wednesday 01 June 2022 17:19:53 +0000 (0:00:00.860) 0:00:43.050 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:19:54 +0000 (0:00:00.058) 0:00:43.109 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 
0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:19:54 +0000 (0:00:00.048) 0:00:43.157 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:19:54 +0000 (0:00:00.032) 0:00:43.190 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" }, "/dev/mapper/foo-test2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test2", "size": "3G", "type": "lvm", "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" }, "/dev/mapper/foo-test3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test3", "size": "3G", "type": "lvm", "uuid": "868b8e29-657d-47c7-8628-756264fa208d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Bvu1GG-umF9-GYUs-qkzc-lmqk-Tz0s-1mvpd0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:19:54 +0000 (0:00:00.393) 0:00:43.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002954", "end": "2022-06-01 13:19:54.220463", "rc": 0, "start": "2022-06-01 13:19:54.217509" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 xfs defaults 0 0 /dev/mapper/foo-test3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:19:54 +0000 (0:00:00.366) 0:00:43.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003736", "end": "2022-06-01 13:19:54.596794", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:19:54.593058" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.387) 0:00:44.338 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.076) 0:00:44.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.034) 0:00:44.449 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.066) 0:00:44.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get 
the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.041) 0:00:44.557 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.441) 0:00:44.998 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:19:55 +0000 (0:00:00.044) 0:00:45.042 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.039) 0:00:45.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.036) 0:00:45.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.037) 0:00:45.156 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.031) 0:00:45.188 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.047) 0:00:45.235 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.057) 0:00:45.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.030) 0:00:45.323 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.031) 0:00:45.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.033) 0:00:45.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.031) 0:00:45.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.030) 0:00:45.451 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.031) 0:00:45.483 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.032) 0:00:45.515 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.032) 0:00:45.547 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.064) 0:00:45.611 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.093) 0:00:45.705 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.039) 0:00:45.744 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.035) 0:00:45.780 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.034) 0:00:45.815 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.033) 0:00:45.848 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.033) 0:00:45.881 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.032) 0:00:45.913 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.035) 0:00:45.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.032) 0:00:45.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:19:56 +0000 (0:00:00.029) 0:00:46.011 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.063) 0:00:46.075 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.038) 0:00:46.113 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.147 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.056) 0:00:46.203 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.236 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.035) 0:00:46.272 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.035) 0:00:46.307 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.031) 0:00:46.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.030) 0:00:46.370 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.032) 0:00:46.402 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.435 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.066) 0:00:46.502 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.097) 0:00:46.600 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.032) 0:00:46.632 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.032) 0:00:46.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.032) 0:00:46.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.082) 0:00:46.847 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.880 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.031) 0:00:46.912 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.033) 0:00:46.946 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.039) 0:00:46.985 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.037) 0:00:47.023 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:57 +0000 (0:00:00.039) 0:00:47.062 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.033) 0:00:47.096 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.034) 0:00:47.130 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:47.162 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.033) 0:00:47.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:47.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.034) 0:00:47.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:47.294 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.031) 0:00:47.326 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:47.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.031) 0:00:47.389 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.031) 0:00:47.421 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.034) 0:00:47.455 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:47.487 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.083) 0:00:47.571 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.038) 0:00:47.609 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.126) 0:00:47.736 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.036) 0:00:47.773 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.045) 0:00:47.819 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.038) 0:00:47.857 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.039) 0:00:47.897 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.039) 0:00:47.937 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.031) 0:00:47.968 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:48.000 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.031) 0:00:48.032 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:19:58 +0000 (0:00:00.032) 0:00:48.065 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.050) 0:00:48.115 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.034) 0:00:48.150 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.036) 0:00:48.186 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.030) 0:00:48.217 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.032) 0:00:48.250 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.037) 0:00:48.288 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.040) 0:00:48.328 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103965.2331214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.2331214, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17778, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.2331214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.384) 0:00:48.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.039) 0:00:48.752 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.036) 0:00:48.789 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.034) 0:00:48.823 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.031) 0:00:48.855 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.036) 0:00:48.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.033) 0:00:48.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.031) 0:00:48.957 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.036) 0:00:48.993 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:19:59 +0000 (0:00:00.046) 0:00:49.039 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.034) 0:00:49.073 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.106 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.035) 0:00:49.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.085) 0:00:49.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.260 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.040) 0:00:49.300 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.039) 0:00:49.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.372 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.031) 0:00:49.403 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.436 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.031) 0:00:49.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.033) 0:00:49.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.533 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.031) 0:00:49.565 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.033) 0:00:49.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.036) 0:00:49.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.033) 0:00:49.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:20:00 +0000 (0:00:00.032) 0:00:49.701 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.377) 0:00:50.079 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.382) 0:00:50.461 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.037) 0:00:50.498 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.034) 0:00:50.533 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.033) 0:00:50.567 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.031) 0:00:50.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.034) 0:00:50.633 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.033) 0:00:50.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.038) 0:00:50.698 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.038) 0:00:50.737
******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.037) 0:00:50.774 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:01 +0000 (0:00:00.042) 0:00:50.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035642", "end": "2022-06-01 13:20:01.506964", "rc": 0, "start": "2022-06-01 13:20:01.471322" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.425) 0:00:51.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.040) 0:00:51.282 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.039) 0:00:51.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.033) 0:00:51.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.034) 0:00:51.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.033) 0:00:51.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.036) 0:00:51.460 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.032) 0:00:51.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.036) 0:00:51.530 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.140) 0:00:51.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.038) 0:00:51.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, 
"mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "e5836ee7-616f-49db-aac9-e8962a820d3f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.044) 0:00:51.753 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.041) 0:00:51.794 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.036) 0:00:51.830 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.038) 0:00:51.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.031) 0:00:51.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:02 +0000 
(0:00:00.033) 0:00:51.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.038) 0:00:51.973 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:02 +0000 (0:00:00.079) 0:00:52.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.048) 0:00:52.102 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.037) 0:00:52.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.038) 0:00:52.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.032) 0:00:52.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.035) 0:00:52.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.043) 0:00:52.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.041) 0:00:52.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103965.0021214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.0021214, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17745, "isblk": true, "ischr": false, "isdir": 
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.0021214, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.378) 0:00:52.708 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.039) 0:00:52.748 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.037) 0:00:52.785 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.034) 0:00:52.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.033) 0:00:52.853 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat 
the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.036) 0:00:52.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.031) 0:00:52.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.033) 0:00:52.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.032) 0:00:52.987 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.040) 0:00:53.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:03 +0000 (0:00:00.035) 0:00:53.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.031) 0:00:53.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.031) 0:00:53.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.031) 0:00:53.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.037) 0:00:53.227 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.038) 0:00:53.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.035) 0:00:53.464 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.033) 0:00:53.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.031) 0:00:53.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.031) 0:00:53.594 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.032) 0:00:53.626 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:04 +0000 (0:00:00.378) 0:00:54.004 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": 
"3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.390) 0:00:54.395 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.042) 0:00:54.437 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.033) 0:00:54.470 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.033) 0:00:54.504 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.030) 0:00:54.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.031) 0:00:54.566 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.031) 0:00:54.598 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.031) 0:00:54.630 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.039) 0:00:54.670 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.035) 0:00:54.706 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:05 +0000 (0:00:00.041) 0:00:54.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.035996", "end": "2022-06-01 13:20:05.430631", "rc": 0, "start": "2022-06-01 13:20:05.394635" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.417) 0:00:55.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.040) 0:00:55.206 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.040) 0:00:55.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.033) 0:00:55.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.084) 0:00:55.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.034) 0:00:55.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.033) 0:00:55.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.032) 0:00:55.465 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.039) 0:00:55.505 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.132) 0:00:55.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.040) 0:00:55.678 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, 
"block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "868b8e29-657d-47c7-8628-756264fa208d" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "868b8e29-657d-47c7-8628-756264fa208d" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.044) 0:00:55.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.040) 0:00:55.763 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.037) 0:00:55.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.040) 0:00:55.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.035) 0:00:55.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.031) 0:00:55.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.030) 0:00:55.938 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.033) 0:00:55.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.048) 0:00:56.020 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:06 +0000 (0:00:00.036) 0:00:56.056 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.040) 0:00:56.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.029) 0:00:56.127 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.030) 0:00:56.158 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.039) 0:00:56.198 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.039) 0:00:56.237 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103964.7581215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103964.7581215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17711, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103964.7581215, "nlink": 1, "path": "/dev/mapper/foo-test3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.412) 0:00:56.649 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.037) 0:00:56.687 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 
2022 17:20:07 +0000 (0:00:00.037) 0:00:56.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.036) 0:00:56.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.029) 0:00:56.791 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.033) 0:00:56.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.029) 0:00:56.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.034) 0:00:56.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.032) 0:00:56.921 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.058) 0:00:56.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.037) 0:00:57.017 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:07 +0000 (0:00:00.032) 0:00:57.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.036) 0:00:57.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.033) 0:00:57.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.040) 0:00:57.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.038) 0:00:57.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.036) 0:00:57.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.033) 0:00:57.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.033) 0:00:57.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.033) 0:00:57.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.035) 0:00:57.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
17:20:08 +0000 (0:00:00.032) 0:00:57.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.032) 0:00:57.599 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:08 +0000 (0:00:00.416) 0:00:58.015 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.392) 0:00:58.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.039) 0:00:58.447 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.032) 0:00:58.480 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.032) 0:00:58.512 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.031) 0:00:58.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.032) 0:00:58.576 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.031) 0:00:58.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.033) 0:00:58.640 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.035) 0:00:58.676 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.032) 0:00:58.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:09 +0000 (0:00:00.038) 0:00:58.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test3" ], "delta": "0:00:00.047280", "end": "2022-06-01 13:20:09.453257", "rc": 0, "start": "2022-06-01 13:20:09.405977" } STDOUT: LVM2_LV_NAME=test3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.446) 0:00:59.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.042) 0:00:59.236 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.041) 0:00:59.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.034) 0:00:59.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 
01 June 2022 17:20:10 +0000 (0:00:00.032) 0:00:59.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.034) 0:00:59.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.033) 0:00:59.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.032) 0:00:59.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.032) 0:00:59.477 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.029) 0:00:59.507 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove two of the LVs] *************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:66 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.031) 0:00:59.539 ******** TASK 
[linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.075) 0:00:59.614 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:20:10 +0000 (0:00:00.047) 0:00:59.661 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.540) 0:01:00.201 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 
Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.077) 0:01:00.279 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.034) 0:01:00.314 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.033) 0:01:00.347 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.068) 0:01:00.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.029) 0:01:00.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.033) 0:01:00.478 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { 
"mount_point": "/opt/test2", "name": "test2", "size": "3g", "state": "absent" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.039) 0:01:00.518 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.035) 0:01:00.553 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.033) 0:01:00.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.035) 0:01:00.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.033) 0:01:00.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.031) 0:01:00.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.046) 0:01:00.736 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:20:11 +0000 (0:00:00.029) 0:01:00.765 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": 
"/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:20:14 +0000 (0:00:02.385) 0:01:03.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:20:14 +0000 (0:00:00.032) 0:01:03.183 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:20:14 +0000 (0:00:00.031) 0:01:03.215 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test3", "fs_type": null }, { "action": "destroy format", 
"device": "/dev/mapper/foo-test2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:20:14 +0000 (0:00:00.048) 0:01:03.263 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:20:14 +0000 (0:00:00.047) 0:01:03.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:20:14 +0000 (0:00:00.037) 0:01:03.349 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': 
u'/dev/mapper/foo-test3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/foo-test3", "state": "absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:20:15 +0000 (0:00:00.763) 0:01:04.113 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:20:15 +0000 (0:00:00.690) 0:01:04.803 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell 
systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:20:16 +0000 (0:00:00.415) 0:01:05.218 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:20:16 +0000 (0:00:00.677) 0:01:05.896 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:20:17 +0000 (0:00:00.379) 0:01:06.276 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:20:17 +0000 (0:00:00.031) 0:01:06.307 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] 
*********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:86 Wednesday 01 June 2022 17:20:18 +0000 (0:00:00.874) 0:01:07.182 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:20:18 +0000 (0:00:00.065) 0:01:07.248 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test3", "_mount_id": "/dev/mapper/foo-test3", "_raw_device": "/dev/mapper/foo-test3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:20:18 +0000 (0:00:00.045) 0:01:07.294 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:20:18 +0000 (0:00:00.029) 0:01:07.323 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Bvu1GG-umF9-GYUs-qkzc-lmqk-Tz0s-1mvpd0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:20:18 +0000 (0:00:00.379) 0:01:07.703 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003495", "end": "2022-06-01 13:20:18.351014", "rc": 0, "start": "2022-06-01 13:20:18.347519" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.395) 0:01:08.098 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002571", "end": "2022-06-01 13:20:18.749538", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:20:18.746967" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.381) 0:01:08.479 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
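[Editor's note: the fstab state above (only foo-test1 still mounted, test2/test3 gone) matches the pool spec reported in the blivet output earlier in this run. As a minimal sketch of the role invocation that would produce this result, inferred from that output rather than copied from tests_lvm_one_disk_multiple_volumes.yml, which may differ:]

```yaml
# Hedged sketch, not the actual test playbook: second invocation of the
# storage role, removing test2/test3 while keeping test1 mounted.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ['sda']
            volumes:
              - { name: test1, size: 3g, fs_type: xfs, mount_point: /opt/test1, state: present }
              - { name: test2, size: 3g, fs_type: xfs, mount_point: /opt/test2, state: absent }
              - { name: test3, size: 3g, fs_type: xfs, mount_point: /opt/test3, state: absent }
```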
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.075) 0:01:08.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.031) 0:01:08.586 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.063) 0:01:08.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:20:19 +0000 (0:00:00.042) 0:01:08.692 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.401) 0:01:09.093 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.044) 0:01:09.138 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.040) 0:01:09.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.036) 0:01:09.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.036) 0:01:09.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031) 0:01:09.283 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.046) 0:01:09.329 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.058) 0:01:09.387 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.032) 0:01:09.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031) 0:01:09.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.034) 0:01:09.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031) 0:01:09.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031) 0:01:09.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:20:20 +0000 (0:00:00.033) 0:01:09.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.033) 0:01:09.616 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.032) 0:01:09.648 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.062) 0:01:09.710 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.087) 0:01:09.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.038) 0:01:09.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.034)       0:01:09.871 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.032)       0:01:09.903 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031)       0:01:09.934 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.032)       0:01:09.966 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031)       0:01:09.997 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.032)       0:01:10.029 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:20 +0000 (0:00:00.031)       0:01:10.060 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.031)       0:01:10.091 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.063)       0:01:10.155 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.039)       0:01:10.195 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.036)       0:01:10.232 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.060)       0:01:10.292 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.034)       0:01:10.326 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.034)       0:01:10.360 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:10.390 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.027)       0:01:10.417 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.027)       0:01:10.445 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.028)       0:01:10.474 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.028)       0:01:10.503 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.114)       0:01:10.617 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.099)       0:01:10.716 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.032)       0:01:10.749 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.029)       0:01:10.779 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.031)       0:01:10.810 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.031)       0:01:10.842 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:10.872 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.034)       0:01:10.907 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:10.937 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.032)       0:01:10.970 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:11.000 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:11.031 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:21 +0000 (0:00:00.030)       0:01:11.061 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.032)       0:01:11.094 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.028)       0:01:11.122 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.030)       0:01:11.152 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.031)       0:01:11.184 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.032)       0:01:11.216 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.029)       0:01:11.246 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.031)       0:01:11.277 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.032)       0:01:11.310 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.030)       0:01:11.340 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.031)       0:01:11.371 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.031)       0:01:11.403 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.031)       0:01:11.435 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.035)       0:01:11.471 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.037)       0:01:11.508 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.087)       0:01:11.595 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.046)       0:01:11.642 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.137)       0:01:11.779 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.038)       0:01:11.818 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.046)       0:01:11.864 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.038)       0:01:11.903 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.037)       0:01:11.940 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.044)       0:01:11.985 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.033)       0:01:12.018 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:20:22 +0000 (0:00:00.029)       0:01:12.048 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.032)       0:01:12.080 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.032)       0:01:12.113 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/foo-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.051)       0:01:12.165 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.037)       0:01:12.202 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.037)       0:01:12.239 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.028)       0:01:12.267 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.029)       0:01:12.297 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.038)       0:01:12.335 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.039)       0:01:12.374 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654103965.2331214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.2331214, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17778, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.2331214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.370)       0:01:12.745 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.089)       0:01:12.834 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.038)       0:01:12.872 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.036)       0:01:12.908 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.030)       0:01:12.939 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.033)       0:01:12.973 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.028)       0:01:13.001 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:20:23 +0000 (0:00:00.032)       0:01:13.034 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.037)       0:01:13.072 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.040)       0:01:13.112 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.032)       0:01:13.145 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.031)       0:01:13.176 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.029)       0:01:13.206 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.032)       0:01:13.239 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.029)       0:01:13.268 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.035)       0:01:13.304 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.032)       0:01:13.337 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.028)       0:01:13.365 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.028)       0:01:13.393 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.033)       0:01:13.427 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.031)       0:01:13.458 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.029)       0:01:13.488 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.030)       0:01:13.519 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.028)       0:01:13.547 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.030)       0:01:13.577 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.034)       0:01:13.612 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.032)       0:01:13.645 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.031)       0:01:13.676 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:20:24 +0000 (0:00:00.371)       0:01:14.047 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.377)       0:01:14.425 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "3221225472"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.045)       0:01:14.470 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.039)       0:01:14.509 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.032)       0:01:14.542 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.033)       0:01:14.575 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.035)       0:01:14.610 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.033)       0:01:14.643 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.033)       0:01:14.677 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.038)       0:01:14.715 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.036)       0:01:14.752 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:20:25 +0000 (0:00:00.039)       0:01:14.791 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.035225", "end": "2022-06-01 13:20:25.480727", "rc": 0, "start": "2022-06-01 13:20:25.445502"}
STDOUT:   LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.423)       0:01:15.215 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.040)       0:01:15.255 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.040)       0:01:15.296 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.032)       0:01:15.329 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.032)       0:01:15.361 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.032)       0:01:15.394 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.032)       0:01:15.427 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.034)       0:01:15.461 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.036)       0:01:15.497 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.176) 0:01:15.674 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.036) 0:01:15.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.040) 0:01:15.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.030) 0:01:15.782 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.040) 0:01:15.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.034) 0:01:15.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.034) 0:01:15.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.035) 0:01:15.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.033) 0:01:15.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.034) 0:01:15.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:26 +0000 (0:00:00.051) 0:01:16.045 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.026) 0:01:16.072 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.036) 0:01:16.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.033) 0:01:16.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.034) 0:01:16.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.033) 0:01:16.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.033) 0:01:16.243 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.377) 0:01:16.621 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.041) 0:01:16.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.029) 0:01:16.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.035) 0:01:16.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.032) 0:01:16.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.025) 0:01:16.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.028) 0:01:16.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.030) 0:01:16.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.028) 0:01:16.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.025) 0:01:16.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.029) 0:01:16.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.034) 0:01:16.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.032) 0:01:16.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.033) 0:01:17.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:27 +0000 (0:00:00.033) 0:01:17.062 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.040) 0:01:17.102 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab 
entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.036) 0:01:17.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.030) 0:01:17.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.031) 0:01:17.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.033) 0:01:17.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.031) 0:01:17.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.032) 0:01:17.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.032) 0:01:17.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.034) 0:01:17.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.032) 0:01:17.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.038) 0:01:17.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.041) 0:01:17.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.036) 0:01:17.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.031) 0:01:17.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.030) 0:01:17.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.029) 0:01:17.607 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.037) 0:01:17.644 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.035) 0:01:17.679 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.030) 0:01:17.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.031) 0:01:17.741 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.029) 0:01:17.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.029) 0:01:17.800 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.035) 0:01:17.836 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.031) 0:01:17.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.028) 0:01:17.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.028) 0:01:17.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.037) 0:01:17.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:28 +0000 (0:00:00.030) 0:01:17.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.086) 0:01:18.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.033) 0:01:18.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.176 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.208 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.036) 0:01:18.245 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.126) 0:01:18.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.041) 0:01:18.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], 
"storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.041) 0:01:18.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.485 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.039) 0:01:18.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.033) 0:01:18.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.028) 0:01:18.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.032) 0:01:18.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.046) 0:01:18.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.026) 0:01:18.756 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.036) 0:01:18.793 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.033) 0:01:18.827 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.031) 0:01:18.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:29 +0000 (0:00:00.027) 0:01:18.917 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.378) 0:01:19.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.035) 0:01:19.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.025) 0:01:19.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.033) 0:01:19.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.030) 0:01:19.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.027) 0:01:19.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.030) 0:01:19.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.031) 0:01:19.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.031) 0:01:19.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.030) 0:01:19.572 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.032) 0:01:19.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.030) 0:01:19.636 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.031) 0:01:19.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.031) 0:01:19.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.033) 0:01:19.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.044) 0:01:19.778 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.034) 0:01:19.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:19.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:19.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:19.897 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.029) 0:01:19.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.030) 0:01:19.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:19.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:20.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:20.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:30 +0000 (0:00:00.028) 0:01:20.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.029) 0:01:20.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.032) 0:01:20.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.029) 0:01:20.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.028) 0:01:20.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.028) 0:01:20.219 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.251 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.029) 0:01:20.280 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.068) 0:01:20.380 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.029) 0:01:20.441 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.473 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK 
[assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.032) 0:01:20.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.036) 0:01:20.573 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.033) 0:01:20.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.035) 0:01:20.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.033) 0:01:20.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.032) 0:01:20.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.038) 0:01:20.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.034) 0:01:20.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.032) 0:01:20.846 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.027) 0:01:20.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Re-run the previous role invocation to ensure idempotence] *************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:88 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.031) 0:01:20.905 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.078) 0:01:20.983 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:20:31 +0000 (0:00:00.045) 0:01:21.029 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.546) 0:01:21.576 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : 
define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.082) 0:01:21.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.031) 0:01:21.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.029) 0:01:21.719 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.067) 0:01:21.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.027) 0:01:21.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.031) 0:01:21.846 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "3g", "state": "absent" }, { "mount_point": "/opt/test3", "name": "test3", "size": "3g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.038) 0:01:21.884 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.038) 0:01:21.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.035) 0:01:21.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.037) 0:01:21.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.033) 0:01:22.030 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:20:32 +0000 (0:00:00.031) 0:01:22.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:20:33 +0000 (0:00:00.044) 0:01:22.105 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:20:33 +0000 (0:00:00.032) 0:01:22.138 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:20:34 +0000 (0:00:01.367) 0:01:23.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.032) 0:01:23.537 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.029) 0:01:23.567 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.045) 0:01:23.613 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.046) 0:01:23.659 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.091) 0:01:23.751 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:20:34 +0000 (0:00:00.034) 0:01:23.785 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:20:35 +0000 (0:00:00.684) 0:01:24.470 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:20:35 +0000 (0:00:00.391) 0:01:24.862 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:20:36 +0000 (0:00:00.655) 0:01:25.518 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:20:36 +0000 (0:00:00.391) 0:01:25.909 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:20:36 +0000 (0:00:00.031) 0:01:25.940 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:108
Wednesday 01 June 2022 17:20:37 +0000 (0:00:00.876) 0:01:26.817 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:20:37 +0000 (0:00:00.067) 0:01:26.885 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "test3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:20:37 +0000 (0:00:00.047) 0:01:26.932 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:20:37 +0000 (0:00:00.037) 0:01:26.970 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "Bvu1GG-umF9-GYUs-qkzc-lmqk-Tz0s-1mvpd0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:20:38 +0000 (0:00:00.405) 0:01:27.375 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002512", "end": "2022-06-01 13:20:38.011241", "rc": 0, "start": "2022-06-01 13:20:38.008729" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:20:38 +0000 (0:00:00.369) 0:01:27.745 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003400", "end": "2022-06-01 13:20:38.399054", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:20:38.395654" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.389) 0:01:28.134 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.075) 0:01:28.210 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.033) 0:01:28.243 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.067) 0:01:28.311 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.084) 0:01:28.395 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.379) 0:01:28.775 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.044) 0:01:28.820 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.040) 0:01:28.861 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.036) 0:01:28.897 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.037) 0:01:28.934 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.033) 0:01:28.968 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.042) 0:01:29.011 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:20:39 +0000 (0:00:00.054) 0:01:29.065 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.030) 0:01:29.096 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.034) 0:01:29.131 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.029) 0:01:29.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.028) 0:01:29.189 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.032) 0:01:29.222 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.041) 0:01:29.263 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.035) 0:01:29.299 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.037) 0:01:29.337 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.062) 0:01:29.399 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.089) 0:01:29.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.033) 0:01:29.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.032) 0:01:29.555 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.033) 0:01:29.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.034) 0:01:29.622 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.032) 0:01:29.655 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.035) 0:01:29.691 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.034) 0:01:29.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.034) 0:01:29.760 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.032) 0:01:29.792 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.064) 0:01:29.857 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.039) 0:01:29.897 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.036) 0:01:29.933 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.059) 0:01:29.993 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:20:40 +0000 (0:00:00.039) 0:01:30.032 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.040) 0:01:30.072 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.031) 0:01:30.104 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.031) 0:01:30.135 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.031) 0:01:30.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.033) 0:01:30.200 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.233 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.068) 0:01:30.302 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.095) 0:01:30.397 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.462 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.029) 0:01:30.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.029) 0:01:30.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.031) 0:01:30.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.096) 0:01:30.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.034) 0:01:30.683 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.033) 0:01:30.716 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.034) 0:01:30.751 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.784 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.033) 0:01:30.817 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.035) 0:01:30.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.885 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.031) 0:01:30.917 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.949 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:30.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.032) 0:01:31.015 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:20:41 +0000 (0:00:00.036) 0:01:31.052 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.033) 0:01:31.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.117 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.031) 0:01:31.149 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.031) 0:01:31.181 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.213 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.035) 0:01:31.249 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.281 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.083) 0:01:31.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.040) 0:01:31.405 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.126) 0:01:31.531 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.038) 0:01:31.570 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "52ca2c0a-dfe3-4af2-96ac-d02391d00934" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.049) 0:01:31.619 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.043) 0:01:31.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:42 +0000 
(0:00:00.039) 0:01:31.702 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.046) 0:01:31.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:31.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.033) 0:01:31.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.050) 0:01:31.930 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.042) 0:01:31.973 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.042) 0:01:32.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:42 +0000 (0:00:00.032) 0:01:32.048 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.033) 0:01:32.081 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.042) 0:01:32.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.042) 0:01:32.166 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103965.2331214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654103965.2331214, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 17778, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654103965.2331214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.401) 0:01:32.568 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.038) 0:01:32.607 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.037) 0:01:32.645 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.035) 0:01:32.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.032) 0:01:32.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.037) 0:01:32.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.034) 0:01:32.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.032) 0:01:32.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.032) 0:01:32.850 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.040) 0:01:32.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.033) 0:01:32.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.031) 0:01:32.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:43 +0000 (0:00:00.035) 0:01:32.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.080) 0:01:33.072 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.032) 0:01:33.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.040) 0:01:33.145 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.041) 0:01:33.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.034) 0:01:33.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.030) 0:01:33.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.036) 0:01:33.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.034) 0:01:33.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.032) 0:01:33.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.031) 0:01:33.387 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.032) 0:01:33.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.033) 0:01:33.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.037) 0:01:33.490 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.031) 0:01:33.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.033) 0:01:33.556 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:44 +0000 (0:00:00.385) 0:01:33.941 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.385) 0:01:34.326 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.038) 0:01:34.365 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.035) 0:01:34.401 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.032) 0:01:34.433 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.033) 0:01:34.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.032) 0:01:34.499 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.032) 0:01:34.531 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.032) 0:01:34.564 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.036) 0:01:34.600 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.036) 0:01:34.636 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:45 +0000 (0:00:00.044) 0:01:34.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.040628", "end": "2022-06-01 13:20:45.383997", "rc": 0, "start": "2022-06-01 13:20:45.343369" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.442) 0:01:35.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.038) 0:01:35.162 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.037) 0:01:35.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.031) 0:01:35.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.030) 0:01:35.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.032) 0:01:35.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.030) 0:01:35.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.029) 0:01:35.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.033) 0:01:35.389 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml 
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.127) 0:01:35.516 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.036) 0:01:35.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.043) 0:01:35.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.032) 0:01:35.629 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.041) 0:01:35.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.032) 0:01:35.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.035) 0:01:35.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.031) 0:01:35.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.035) 0:01:35.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.084) 0:01:35.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.049) 0:01:35.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.025) 0:01:35.965 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.037) 0:01:36.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.029) 0:01:36.032 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:46 +0000 (0:00:00.033) 0:01:36.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.029) 0:01:36.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.025) 0:01:36.121 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.358) 0:01:36.479 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.039) 0:01:36.519 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.025) 0:01:36.544 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.032) 0:01:36.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.031) 0:01:36.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.027) 0:01:36.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.031) 0:01:36.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.031) 0:01:36.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.028) 0:01:36.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.026) 0:01:36.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.029) 0:01:36.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.030) 0:01:36.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.029) 0:01:36.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.030) 0:01:36.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.031) 0:01:36.906 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.043) 0:01:36.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab 
entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.040) 0:01:36.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.036) 0:01:37.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:47 +0000 (0:00:00.031) 0:01:37.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.030) 0:01:37.090 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.032) 0:01:37.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.034) 0:01:37.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.029) 0:01:37.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.034) 0:01:37.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.034) 0:01:37.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.032) 0:01:37.444 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.033) 0:01:37.477 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.030) 0:01:37.508 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.029) 0:01:37.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.569 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.032) 0:01:37.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.032) 0:01:37.634 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.035) 0:01:37.670 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.033) 0:01:37.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.030) 0:01:37.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.034) 0:01:37.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.032) 0:01:37.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.033) 0:01:37.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.031) 0:01:37.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.030) 0:01:37.958 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.035) 0:01:37.993 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:20:48 +0000 (0:00:00.037) 0:01:38.031 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.167) 0:01:38.198 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.043) 0:01:38.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": 
"0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.043) 0:01:38.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.031) 0:01:38.317 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.038) 0:01:38.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.029) 0:01:38.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.028) 0:01:38.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.028) 0:01:38.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.028) 0:01:38.469 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.029) 0:01:38.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.047) 0:01:38.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.027) 0:01:38.573 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.038) 0:01:38.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.029) 0:01:38.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.029) 0:01:38.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.030) 0:01:38.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:20:49 +0000 (0:00:00.029) 0:01:38.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.396) 0:01:39.129 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.038) 0:01:39.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.025) 0:01:39.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.035) 0:01:39.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.030) 0:01:39.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.024) 0:01:39.283 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.033) 0:01:39.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.029) 0:01:39.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.028) 0:01:39.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.024) 0:01:39.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.027) 0:01:39.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.028) 0:01:39.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.030) 0:01:39.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.028) 0:01:39.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.028) 0:01:39.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.036) 0:01:39.578 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.037) 0:01:39.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.030) 0:01:39.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.032) 0:01:39.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.029) 0:01:39.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.030) 0:01:39.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.031) 0:01:39.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.030) 0:01:39.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.032) 0:01:39.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.034) 0:01:39.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.032) 0:01:39.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.031) 0:01:39.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.031) 0:01:39.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.028) 0:01:39.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.031) 0:01:40.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:20:50 +0000 (0:00:00.035) 0:01:40.058 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.036)       0:01:40.095 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.032)       0:01:40.127 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.032)       0:01:40.160 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.033)       0:01:40.194 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.032)       0:01:40.226 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.035)       0:01:40.262 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.036)       0:01:40.298 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.035)       0:01:40.333 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.032)       0:01:40.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.032)       0:01:40.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.031)       0:01:40.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.035)       0:01:40.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.080)       0:01:40.545 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.033)       0:01:40.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.033)       0:01:40.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.030)       0:01:40.642 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.029)       0:01:40.672 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.029)       0:01:40.702 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.030)       0:01:40.732 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:110
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.030)       0:01:40.762 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.092)       0:01:40.855 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:20:51 +0000 (0:00:00.046)       0:01:40.901 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.531)       0:01:41.433 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.081)       0:01:41.515 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.035)       0:01:41.551 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.031)       0:01:41.582 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.069)       0:01:41.652 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.036)       0:01:41.688 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.033)       0:01:41.722 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.038)       0:01:41.760 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.034)       0:01:41.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.032)       0:01:41.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.032)       0:01:41.859 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.035)       0:01:41.894 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.032)       0:01:41.926 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.052)       0:01:41.979 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:20:52 +0000 (0:00:00.032)       0:01:42.011 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:20:54 +0000 (0:00:01.909)       0:01:43.921 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:20:54 +0000 (0:00:00.032)       0:01:43.953 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:20:54 +0000 (0:00:00.029)       0:01:43.982 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:20:54 +0000 (0:00:00.038)       0:01:44.021 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:20:54 +0000 (0:00:00.037)       0:01:44.058 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:20:55 +0000 (0:00:00.036)       0:01:44.094 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:20:55 +0000 (0:00:00.421)       0:01:44.516 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:20:56 +0000 (0:00:00.670)       0:01:45.186 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:20:56 +0000 (0:00:00.030)       0:01:45.217 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:20:56 +0000 (0:00:00.695)       0:01:45.912 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:20:57 +0000 (0:00:00.405)       0:01:46.318 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:20:57 +0000 (0:00:00.030)       0:01:46.349 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=720 changed=6 unreachable=0 failed=0 skipped=687 rescued=0 ignored=0

Wednesday 01 June 2022 17:20:58 +0000 (0:00:00.878)       0:01:47.228 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.39s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.24s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.62s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.27s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : set up new/current mounts ------------------ 1.26s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_scsi_generated.yml:3
linux-system-roles.storage : set up new/current mounts ------------------ 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : make sure blivet is available -------------- 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.78s
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml:2 --------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:20:58 +0000 (0:00:00.022)       0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:21:00 +0000 (0:00:01.306)       0:00:01.329 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.31s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_one_disk_one_volume.yml ************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:2
Wednesday 01 June 2022 17:21:00 +0000 (0:00:00.013)       0:00:01.343 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:12
Wednesday 01 June 2022 17:21:01 +0000 (0:00:01.093)       0:00:02.436 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:21:01 +0000 (0:00:00.037)       0:00:02.473 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:21:01 +0000 (0:00:00.153)       0:00:02.626 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.532)       0:00:03.159 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.071)       0:00:03.231 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.020)       0:00:03.252 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.022)       0:00:03.274 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.193)       0:00:03.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:21:02 +0000 (0:00:00.018)       0:00:03.486 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:21:03 +0000 (0:00:01.130)       0:00:04.617 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:21:03 +0000 (0:00:00.046)       0:00:04.663 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:21:03 +0000 (0:00:00.044)       0:00:04.708 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:21:04 +0000 (0:00:00.675)       0:00:05.383 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:21:04 +0000 (0:00:00.080)       0:00:05.464 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:21:04 +0000 (0:00:00.020)       0:00:05.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:21:04 +0000 (0:00:00.021) 0:00:05.506 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:21:04 +0000 (0:00:00.021) 0:00:05.528 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:21:05 +0000 (0:00:00.809) 0:00:06.337 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:21:07 +0000 (0:00:01.793) 0:00:08.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:21:07 +0000 
(0:00:00.043) 0:00:08.174 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.027) 0:00:08.202 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.512) 0:00:08.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.029) 0:00:08.743 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.025) 0:00:08.769 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.031) 0:00:08.801 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.030) 0:00:08.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.031) 0:00:08.863 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.064) 0:00:08.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.030) 0:00:08.959 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.033) 0:00:08.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:21:07 +0000 (0:00:00.030) 0:00:09.022 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:21:08 +0000 (0:00:00.455) 0:00:09.478 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:21:08 +0000 (0:00:00.027) 0:00:09.505 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:15 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.876) 0:00:10.382 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:22 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.030) 0:00:10.412 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:21:09 +0000 
(0:00:00.043) 0:00:10.456 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.471) 0:00:10.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.034) 0:00:10.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.029) 0:00:10.992 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create one LVM logical volume under one volume group] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:27 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.031) 0:00:11.023 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:21:09 +0000 (0:00:00.053) 0:00:11.077 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.044) 0:00:11.121 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.590) 0:00:11.712 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.066) 0:00:11.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.031) 0:00:11.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:21:10 +0000 (0:00:00.030) 0:00:11.840 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.060) 0:00:11.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.025) 0:00:11.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.029) 0:00:11.956 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.034) 0:00:11.990 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.030) 0:00:12.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
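The `show storage_pools` debug output above implies a role invocation along the following lines. This is a hedged sketch reconstructed from the logged variable, not the literal contents of `tests_lvm_one_disk_one_volume.yml`; the use of `{{ unused_disks }}` (which resolves to `['sda']` in this run) is an assumption based on the `Set unused_disks if necessary` task earlier in the log.

```yaml
# Hypothetical reconstruction of the storage role call that produced
# the "show storage_pools" output above.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: "{{ unused_disks }}"  # assumption: ['sda'] per this run
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
```

Per the log's `actions` list later on, this single declaration drives the creation of the LVM PV on /dev/sda, the `foo` VG, the `test1` LV, and the xfs filesystem mounted at /opt/test1.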
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.028) 0:00:12.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:21:10 +0000 (0:00:00.028) 0:00:12.079 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:21:11 +0000 (0:00:00.028) 0:00:12.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:21:11 +0000 (0:00:00.060) 0:00:12.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:21:11 +0000 (0:00:00.042) 0:00:12.210 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:21:11 +0000 (0:00:00.027) 0:00:12.237 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:21:12 +0000 (0:00:01.715) 0:00:13.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:21:12 +0000 (0:00:00.033) 0:00:13.986 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:21:12 +0000 (0:00:00.031) 0:00:14.018 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:21:12 +0000 (0:00:00.041) 0:00:14.059 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:21:12 +0000 (0:00:00.036) 0:00:14.095 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:21:13 +0000 (0:00:00.034) 0:00:14.130 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:21:13 +0000 (0:00:00.028) 0:00:14.158 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:21:13 +0000 (0:00:00.889) 0:00:15.048 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:21:14 +0000 (0:00:00.527) 0:00:15.576 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:21:15 +0000 (0:00:00.636) 0:00:16.212 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:21:15 +0000 (0:00:00.381) 0:00:16.594 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:21:15 +0000 (0:00:00.029) 0:00:16.624 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:39
Wednesday 01 June 2022 17:21:16 +0000 (0:00:00.873) 0:00:17.497 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:21:16 +0000 (0:00:00.052) 0:00:17.550 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:21:16 +0000 (0:00:00.040) 0:00:17.591 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:21:16 +0000 (0:00:00.027) 0:00:17.618 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "v95NOD-TxxI-d2No-ajhq-kwIc-eCI4-yXI10B" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:21:17 +0000 (0:00:00.532) 0:00:18.151 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002975", "end": "2022-06-01 13:21:16.907108", "rc": 0, "start": "2022-06-01 13:21:16.904133" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:21:17 +0000 (0:00:00.522) 0:00:18.673 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002852", "end": "2022-06-01 13:21:17.266465", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:21:17.263613" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:21:17 +0000 (0:00:00.364) 0:00:19.037 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:21:17 +0000 (0:00:00.063) 0:00:19.101 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.030) 0:00:19.131 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.062) 0:00:19.194 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.039) 0:00:19.234 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.474) 0:00:19.708 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.040) 0:00:19.748 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.035) 0:00:19.784 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.035) 0:00:19.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.033) 0:00:19.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.030) 0:00:19.884 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.040) 0:00:19.925 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.053) 0:00:19.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.030) 0:00:20.009 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.030) 0:00:20.039 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.027) 0:00:20.067 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:21:18 +0000 (0:00:00.030) 0:00:20.097 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.029) 0:00:20.127 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.027) 0:00:20.155 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.030) 0:00:20.185 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.038) 0:00:20.223 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.060) 0:00:20.284 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.063) 0:00:20.348 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.028) 0:00:20.376 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.030) 0:00:20.406 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.029) 0:00:20.436 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.063) 0:00:20.499 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.034) 0:00:20.534 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.033) 0:00:20.568 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.056) 0:00:20.624 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.033) 0:00:20.658 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.035) 0:00:20.693 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.029) 0:00:20.723 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.028) 0:00:20.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.028) 0:00:20.780 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.029) 0:00:20.810 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.030) 0:00:20.840 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.062) 0:00:20.902 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.063) 0:00:20.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.082) 0:00:21.048 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:21:19 +0000 (0:00:00.032) 0:00:21.081 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.031) 0:00:21.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.201 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.031) 0:00:21.233 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.030) 0:00:21.263 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.292 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:21:20
+0000 (0:00:00.057) 0:00:21.349 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.035) 0:00:21.385 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.123) 0:00:21.508 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.034) 0:00:21.543 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 
2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.040) 0:00:21.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.036) 0:00:21.620 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.037) 0:00:21.658 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.038) 0:00:21.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.030) 0:00:21.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.029) 0:00:21.786 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.031) 0:00:21.818 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.048) 0:00:21.866 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.032) 0:00:21.899 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.036) 0:00:21.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.030) 0:00:21.966 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.030) 0:00:21.997 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.036) 0:00:22.033 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:21:20 +0000 (0:00:00.046) 0:00:22.080 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104072.1211214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104072.1211214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18028, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104072.1211214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.375) 0:00:22.456 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.040) 0:00:22.496 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.036) 0:00:22.533 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID 
value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.034) 0:00:22.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.031) 0:00:22.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.036) 0:00:22.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.035) 0:00:22.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.031) 0:00:22.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.031) 0:00:22.733 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.038) 0:00:22.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.029) 0:00:22.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.031) 0:00:22.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.033) 0:00:22.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.030) 0:00:22.897 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.030) 0:00:22.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.036) 0:00:22.964 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.034) 0:00:22.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.030) 0:00:23.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.030) 0:00:23.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:21:21 +0000 (0:00:00.030) 0:00:23.089 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.031) 0:00:23.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.029) 0:00:23.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.030) 0:00:23.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.030) 0:00:23.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.033) 0:00:23.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.076) 0:00:23.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.031) 0:00:23.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.030) 0:00:23.381 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:21:22 +0000 (0:00:00.480) 0:00:23.862 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.386) 0:00:24.249 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.038) 0:00:24.287 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.033) 0:00:24.321 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.032) 0:00:24.353 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.031) 0:00:24.384 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.030) 0:00:24.415 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.028) 0:00:24.444 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.027) 0:00:24.472 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.032) 0:00:24.505 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.035) 0:00:24.540 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.037) 0:00:24.578 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", 
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036905", "end": "2022-06-01 13:21:23.212679", "rc": 0, "start": "2022-06-01 13:21:23.175774" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.415) 0:00:24.993 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.036) 0:00:25.030 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.039) 0:00:25.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:21:23 +0000 (0:00:00.032) 0:00:25.102 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.033) 0:00:25.136 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.036) 0:00:25.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.038) 0:00:25.210 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.030) 0:00:25.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.030) 0:00:25.271 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.027) 0:00:25.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:41 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.029) 0:00:25.328 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.059) 
0:00:25.387 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.044) 0:00:25.432 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.551) 0:00:25.984 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.072) 0:00:26.056 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an 
empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:21:24 +0000 (0:00:00.031) 0:00:26.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.030) 0:00:26.118 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.061) 0:00:26.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.025) 0:00:26.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.030) 0:00:26.234 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:21:25 +0000 
(0:00:00.045) 0:00:26.280 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.041) 0:00:26.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.041) 0:00:26.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.077) 0:00:26.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.031) 0:00:26.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.030) 0:00:26.502 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.044) 0:00:26.546 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:21:25 +0000 (0:00:00.028) 0:00:26.575 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:21:26 +0000 (0:00:01.370) 0:00:27.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:21:26 +0000 (0:00:00.031) 0:00:27.977 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:21:26 +0000 (0:00:00.028) 0:00:28.006 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:21:26 +0000 (0:00:00.039) 0:00:28.046 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:21:26 +0000 (0:00:00.036) 0:00:28.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:21:27 +0000 (0:00:00.033) 0:00:28.116 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:21:27 +0000 (0:00:00.031) 0:00:28.148 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:21:27 +0000 (0:00:00.677) 0:00:28.825 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, 
"dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:21:28 +0000 (0:00:00.402) 0:00:29.228 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:21:28 +0000 (0:00:00.636) 0:00:29.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:21:29 +0000 (0:00:00.381) 
0:00:30.246 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:21:29 +0000 (0:00:00.029) 0:00:30.275 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:53 Wednesday 01 June 2022 17:21:29 +0000 (0:00:00.819) 0:00:31.095 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:21:30 +0000 (0:00:00.058) 0:00:31.154 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:21:30 +0000 (0:00:00.075) 0:00:31.229 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:21:30 +0000 (0:00:00.031) 0:00:31.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "v95NOD-TxxI-d2No-ajhq-kwIc-eCI4-yXI10B" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", 
"type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:21:30 +0000 (0:00:00.377) 0:00:31.639 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002623", "end": "2022-06-01 13:21:30.237540", "rc": 0, "start": "2022-06-01 13:21:30.234917" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:21:30 +0000 (0:00:00.363) 0:00:32.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002634", "end": "2022-06-01 13:21:30.605629", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:21:30.602995" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.369) 0:00:32.372 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.064) 0:00:32.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.032) 0:00:32.469 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.077) 0:00:32.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.042) 0:00:32.589 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.395) 0:00:32.984 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.043) 0:00:33.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:21:31 +0000 (0:00:00.041) 0:00:33.069 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.037) 0:00:33.106 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.038) 0:00:33.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.031) 0:00:33.176 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.043) 0:00:33.219 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.054) 0:00:33.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.029) 0:00:33.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.027) 0:00:33.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.027) 0:00:33.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.028) 0:00:33.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.032) 0:00:33.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:21:32 +0000 (0:00:00.029) 0:00:33.448 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.033) 0:00:33.481 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.031) 0:00:33.513 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.058) 0:00:33.572 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.065) 0:00:33.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.035) 0:00:33.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:21:32 +0000 (0:00:00.035) 0:00:33.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.048) 0:00:33.757 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.068) 0:00:33.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.034) 0:00:33.860 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.032) 0:00:33.893 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.054) 0:00:33.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:21:32 +0000 (0:00:00.039) 0:00:33.987 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.119) 0:00:34.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.047) 0:00:34.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.039) 0:00:34.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.032) 0:00:34.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.033) 0:00:34.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.033) 0:00:34.293 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.069) 0:00:34.362 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.067) 0:00:34.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.034) 0:00:34.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.031) 0:00:34.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.031) 0:00:34.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.030) 0:00:34.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.031) 0:00:34.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.030) 0:00:34.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.033) 0:00:34.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.031) 0:00:34.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.031) 0:00:34.714 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.061) 0:00:34.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.040) 0:00:34.816 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.125) 0:00:34.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.036) 0:00:34.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "d36b11a7-30cb-44c2-9943-09d28307d517" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.047) 0:00:35.026 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:21:33 +0000 (0:00:00.039) 0:00:35.066 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:21:34 +0000 
(0:00:00.038) 0:00:35.105 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.037) 0:00:35.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.038) 0:00:35.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.032) 0:00:35.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.030) 0:00:35.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.031) 0:00:35.274 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.048) 0:00:35.323 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.035) 0:00:35.358 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.037) 0:00:35.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.031) 0:00:35.427 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.031) 0:00:35.459 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.037) 0:00:35.496 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.038) 0:00:35.535 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104072.1211214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104072.1211214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18028, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104072.1211214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.393) 0:00:35.929 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.038) 0:00:35.967 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.036) 0:00:36.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.038) 0:00:36.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:21:34 +0000 (0:00:00.035) 0:00:36.078 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.037) 0:00:36.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.035) 0:00:36.152 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.219 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.041) 0:00:36.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.036) 0:00:36.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.090) 0:00:36.455 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.489 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.040) 0:00:36.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.036) 0:00:36.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.031) 0:00:36.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.031) 0:00:36.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.662 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.032) 0:00:36.695 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.032) 0:00:36.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.034) 0:00:36.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.033) 0:00:36.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:21:35 +0000 (0:00:00.032) 0:00:36.928 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.380) 0:00:37.309 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.397) 0:00:37.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.038) 0:00:37.745 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.035) 0:00:37.780 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.032) 0:00:37.812 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.033) 0:00:37.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.032) 0:00:37.878 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.032) 0:00:37.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.030) 0:00:37.942 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.031) 0:00:37.974 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.030) 0:00:38.004 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:21:36 +0000 (0:00:00.039) 0:00:38.044 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.039770", "end": "2022-06-01 13:21:36.694600", "rc": 0, "start": "2022-06-01 13:21:36.654830" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.423) 0:00:38.468 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.038) 0:00:38.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.043) 0:00:38.550 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.030) 0:00:38.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.030) 0:00:38.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.030) 0:00:38.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.033) 0:00:38.675 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.032) 0:00:38.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.032) 0:00:38.740 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.030) 0:00:38.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:55 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.031) 0:00:38.801 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.072) 0:00:38.874 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:21:37 +0000 (0:00:00.043) 0:00:38.918 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.551) 0:00:39.469 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an 
empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.131) 0:00:39.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.032) 0:00:39.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.032) 0:00:39.665 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.064) 0:00:39.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.026) 0:00:39.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.030) 0:00:39.787 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.035) 0:00:39.823 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.031) 0:00:39.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.032) 0:00:39.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.034) 0:00:39.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.030) 0:00:39.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.031) 0:00:39.983 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.060) 0:00:40.044 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:21:38 +0000 (0:00:00.042) 0:00:40.087 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:21:40 +0000 (0:00:01.959) 0:00:42.046 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:21:40 +0000 (0:00:00.030) 0:00:42.077 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:21:41 +0000 (0:00:00.029) 0:00:42.106 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": 
"lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:21:41 +0000 (0:00:00.037) 0:00:42.144 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:21:41 +0000 (0:00:00.038) 0:00:42.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:21:41 +0000 (0:00:00.035) 0:00:42.218 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { 
"ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:21:41 +0000 (0:00:00.403) 0:00:42.622 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:21:42 +0000 (0:00:00.685) 0:00:43.307 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:21:42 +0000 (0:00:00.032) 0:00:43.340 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:21:43 +0000 (0:00:00.898) 0:00:44.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:21:43 +0000 (0:00:00.398) 0:00:44.637 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:21:43 +0000 (0:00:00.038) 0:00:44.675 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:69 Wednesday 01 June 2022 17:21:44 +0000 (0:00:00.874) 0:00:45.550 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:21:44 +0000 (0:00:00.060) 0:00:45.611 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:21:44 +0000 (0:00:00.084) 0:00:45.695 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:21:44 +0000 (0:00:00.029) 0:00:45.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.386) 0:00:46.111 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002925", "end": "2022-06-01 13:21:44.729707", "rc": 0, "start": "2022-06-01 13:21:44.726782" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.386) 0:00:46.498 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002715", "end": "2022-06-01 13:21:45.102206", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:21:45.099491" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.372) 0:00:46.870 ******** [WARNING]: The loop variable 
'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.065) 0:00:46.935 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.033) 0:00:46.969 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.064) 0:00:47.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.040) 0:00:47.074 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:21:45 +0000 (0:00:00.026) 0:00:47.100 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:21:46 
+0000 (0:00:00.025) 0:00:47.125 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.035) 0:00:47.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.032) 0:00:47.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.036) 0:00:47.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.027) 0:00:47.257 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.025) 0:00:47.282 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.052) 0:00:47.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.032) 0:00:47.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.030) 0:00:47.397 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.030) 0:00:47.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 
Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.031) 0:00:47.547 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.576 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.057) 0:00:47.634 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.060) 0:00:47.694 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.032) 0:00:47.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.031) 0:00:47.788 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.069) 0:00:47.857 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.041) 0:00:47.899 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.929 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.029) 0:00:47.958 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.031) 0:00:47.990 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:21:46 +0000 (0:00:00.065) 0:00:48.055 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.068) 0:00:48.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.028) 0:00:48.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.027) 0:00:48.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.026) 0:00:48.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.076) 0:00:48.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.030) 0:00:48.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.027) 0:00:48.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.027) 0:00:48.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.030) 0:00:48.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.029) 0:00:48.430 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.061) 0:00:48.491 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.035) 0:00:48.527 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.117) 0:00:48.644 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.033) 0:00:48.677 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.041) 0:00:48.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.030) 0:00:48.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.039) 0:00:48.789 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.031) 0:00:48.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.029) 0:00:48.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.027) 
0:00:48.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.031) 0:00:48.909 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.031) 0:00:48.941 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.052) 0:00:48.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.030) 0:00:49.024 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.042) 0:00:49.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:21:47 +0000 (0:00:00.030) 0:00:49.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.030) 0:00:49.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.029) 0:00:49.189 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.392) 0:00:49.581 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.042) 0:00:49.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.027) 0:00:49.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.035) 0:00:49.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.027) 0:00:49.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.033) 0:00:49.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.026) 0:00:49.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.032) 0:00:49.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.031) 0:00:49.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.034) 0:00:50.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.032) 0:00:50.032 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:21:48 +0000 (0:00:00.039) 0:00:50.072 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.037) 0:00:50.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.031) 0:00:50.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.031) 0:00:50.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.033) 0:00:50.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.031) 0:00:50.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.031) 0:00:50.367 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.031) 0:00:50.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.030) 0:00:50.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.560 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.096) 0:00:50.656 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.032) 0:00:50.688 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.029) 0:00:50.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.029) 0:00:50.747 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.029) 0:00:50.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.029) 0:00:50.805 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.035) 0:00:50.841 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.033) 0:00:50.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.030) 0:00:50.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.030) 0:00:50.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.036) 0:00:50.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.034) 0:00:51.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.037) 0:00:51.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:21:49 +0000 (0:00:00.033) 0:00:51.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.032) 0:00:51.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.034) 0:00:51.144 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.037) 0:00:51.182 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.032) 0:00:51.214 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.031) 0:00:51.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
********************************************************************* /cache/rhel-x.qcow2 : ok=284 changed=4 unreachable=0 failed=0 skipped=230 rescued=0 ignored=0 Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.015) 0:00:51.262 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.96s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.31s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.13s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:2 -------------------- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : get required packages ---------------------- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.59s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 
20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:21:50 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:21:52 +0000 (0:00:01.326) 0:00:01.350 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_one_disk_one_volume_nvme_generated.yml *********************
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:21:52 +0000 (0:00:00.016) 0:00:01.366 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:21:53 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:21:54 +0000 (0:00:01.349) 0:00:01.372 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.35s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_one_disk_one_volume_scsi_generated.yml *********************
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_scsi_generated.yml:3
Wednesday 01 June 2022 17:21:54 +0000 (0:00:00.014) 0:00:01.387 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_scsi_generated.yml:7
Wednesday 01 June 2022 17:21:55 +0000 (0:00:01.104) 0:00:02.491 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_use_interface": "scsi"}, "changed": false}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:2
Wednesday 01 June 2022 17:21:55 +0000 (0:00:00.024) 0:00:02.515 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:12
Wednesday 01 June 2022 17:21:56 +0000 (0:00:00.859) 0:00:03.374 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:21:56 +0000 (0:00:00.040) 0:00:03.415 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:21:56 +0000 (0:00:00.161) 0:00:03.576 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.562) 0:00:04.138 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.073) 0:00:04.212 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.023) 0:00:04.236 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.021) 0:00:04.258 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.192) 0:00:04.450 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:21:57 +0000 (0:00:00.020) 0:00:04.470 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:21:58 +0000 (0:00:01.105) 0:00:05.576 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:21:58 +0000 (0:00:00.048) 0:00:05.624 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:21:58 +0000 (0:00:00.048) 0:00:05.673 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:21:59 +0000 (0:00:00.729) 0:00:06.402 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:21:59 +0000 (0:00:00.087) 0:00:06.489 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:21:59 +0000 (0:00:00.022) 0:00:06.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:21:59 +0000 (0:00:00.022) 0:00:06.534 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:21:59 +0000 (0:00:00.022) 0:00:06.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:22:00 +0000 (0:00:00.881) 0:00:07.438 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:22:02 +0000 (0:00:01.821) 0:00:09.259 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.043) 0:00:09.302 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.028) 0:00:09.331 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.528) 0:00:09.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.034) 0:00:09.894 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.031) 0:00:09.925 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:22:02 +0000 (0:00:00.036) 0:00:09.962 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.034) 0:00:09.997 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.033) 0:00:10.031 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.030) 0:00:10.061 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.030) 0:00:10.092 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.032) 0:00:10.125 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.030) 0:00:10.156 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.511) 0:00:10.667 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:22:03 +0000 (0:00:00.032) 0:00:10.699 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:15
Wednesday 01 June 2022 17:22:04 +0000 (0:00:00.849) 0:00:11.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:22
Wednesday 01 June 2022 17:22:04 +0000 (0:00:00.032) 0:00:11.581 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:22:04 +0000 (0:00:00.045) 0:00:11.626 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.529) 0:00:12.156 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.035) 0:00:12.191 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.029) 0:00:12.221 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create one LVM logical volume under one volume group] ********************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:27
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.072) 0:00:12.293 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.054) 0:00:12.348 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.043) 0:00:12.391 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:22:05 +0000 (0:00:00.529) 0:00:12.921 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an
empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.070) 0:00:12.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.029) 0:00:13.021 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.031) 0:00:13.052 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.062) 0:00:13.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.026) 0:00:13.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.030) 0:00:13.172 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.036) 0:00:13.209 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.033) 0:00:13.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.028) 0:00:13.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.027) 0:00:13.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.028) 0:00:13.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.028) 0:00:13.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.041) 0:00:13.398 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:22:06 +0000 (0:00:00.028) 0:00:13.426 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:22:08 +0000 (0:00:01.826) 0:00:15.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.035) 0:00:15.288 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.034) 0:00:15.322 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", 
"fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.041) 0:00:15.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.038) 0:00:15.402 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.037) 0:00:15.439 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:22:08 +0000 (0:00:00.031) 0:00:15.470 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:22:09 +0000 (0:00:00.996) 0:00:16.466 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:22:10 +0000 (0:00:00.551) 0:00:17.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:22:10 +0000 (0:00:00.690) 0:00:17.708 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:22:11 +0000 (0:00:00.380) 0:00:18.089 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:22:11 +0000 (0:00:00.027) 0:00:18.116 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:39 Wednesday 01 June 2022 17:22:12 +0000 (0:00:00.865) 0:00:18.982 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:22:12 +0000 (0:00:00.053) 0:00:19.036 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:22:12 +0000 (0:00:00.040) 0:00:19.076 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:22:12 +0000 (0:00:00.031) 0:00:19.107 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "94742250-de6f-4267-abf5-a3017331adc4" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "PDIrR8-M9ii-1uP1-0H0F-X2xu-1JM2-HolOcx" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:22:12 +0000 (0:00:00.502) 0:00:19.610 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002877", "end": "2022-06-01 13:22:12.499656", "rc": 0, "start": "2022-06-01 13:22:12.496779" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.529) 0:00:20.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002779", "end": "2022-06-01 13:22:12.880764", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:22:12.877985" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.377) 0:00:20.517 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.063) 0:00:20.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.031) 0:00:20.612 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.067) 0:00:20.679 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:22:13 +0000 (0:00:00.040) 0:00:20.720 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.538) 0:00:21.258 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.042) 0:00:21.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.038) 0:00:21.339 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.073) 0:00:21.412 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.045) 0:00:21.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.037) 0:00:21.495 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.046) 0:00:21.542 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.063) 0:00:21.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.032) 0:00:21.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 
2022 17:22:14 +0000 (0:00:00.032) 0:00:21.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.033) 0:00:21.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.031) 0:00:21.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.033) 0:00:21.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.031) 0:00:21.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.032) 0:00:21.831 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:22:14 
+0000 (0:00:00.031) 0:00:21.862 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:22:14 +0000 (0:00:00.059) 0:00:21.922 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.071) 0:00:21.993 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.032) 0:00:22.025 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.030) 0:00:22.057 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.030) 0:00:22.087 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.064) 0:00:22.152 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.036) 0:00:22.188 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.035) 0:00:22.224 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.064) 0:00:22.288 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.038) 0:00:22.327 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.037) 0:00:22.364 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01
June 2022 17:22:15 +0000 (0:00:00.029) 0:00:22.394 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.028) 0:00:22.422 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.027) 0:00:22.450 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.033) 0:00:22.484 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.029) 0:00:22.513 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.061) 0:00:22.575 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.071) 0:00:22.646 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.033) 0:00:22.679 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.029) 0:00:22.709 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.029) 0:00:22.739 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.031) 0:00:22.771 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.034) 0:00:22.805 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.031) 0:00:22.837 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.046) 0:00:22.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.037) 0:00:22.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:22:15 +0000 (0:00:00.033) 0:00:22.954 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.069) 0:00:23.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.045) 0:00:23.068 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.133) 0:00:23.202 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.038) 0:00:23.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "94742250-de6f-4267-abf5-a3017331adc4" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": 
"94742250-de6f-4267-abf5-a3017331adc4" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.106) 0:00:23.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.040) 0:00:23.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.040) 0:00:23.428 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.040) 0:00:23.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.032) 0:00:23.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.032) 0:00:23.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.034) 0:00:23.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.033) 0:00:23.602 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.051) 0:00:23.653 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.034) 0:00:23.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.037) 0:00:23.725 ******** skipping: 
[/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.036) 0:00:23.762 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.032) 0:00:23.794 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.039) 0:00:23.834 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:22:16 +0000 (0:00:00.036) 0:00:23.871 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654104127.5161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104127.5161216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18189, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104127.5161216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.409) 0:00:24.281 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.036) 0:00:24.317 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.033) 0:00:24.350 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.035) 0:00:24.385 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.031) 0:00:24.417 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.036) 0:00:24.454 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.031) 0:00:24.485 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.032) 0:00:24.518 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.032) 0:00:24.550 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.043) 0:00:24.593 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.033) 0:00:24.626 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.030) 0:00:24.657 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.030) 0:00:24.687 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.029) 0:00:24.717 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.029) 0:00:24.747 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.040) 0:00:24.787 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.041) 0:00:24.828 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.037) 0:00:24.866 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.032) 0:00:24.898 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.030) 0:00:24.929 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:22:17 +0000 (0:00:00.036) 0:00:24.965 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.035) 0:00:25.000 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.032) 0:00:25.033 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.032) 0:00:25.066 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.032) 0:00:25.098 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.032) 0:00:25.131 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.032) 0:00:25.163 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.035) 0:00:25.199 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:22:18 +0000 (0:00:00.531) 0:00:25.731 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.406) 0:00:26.137 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.038) 0:00:26.176 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.035) 0:00:26.211 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.032) 0:00:26.244 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.033) 0:00:26.277 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.074) 0:00:26.352 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.035) 0:00:26.388 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.031) 0:00:26.419 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.039) 0:00:26.458 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "4294967296"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.034) 0:00:26.493 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:22:19 +0000 (0:00:00.044) 0:00:26.537 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.045721", "end": "2022-06-01 13:22:19.331684", "rc": 0, "start": "2022-06-01 13:22:19.285963"}
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.436) 0:00:26.974 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type]
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.042) 0:00:27.016 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.040) 0:00:27.057 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.033) 0:00:27.091 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.034) 0:00:27.126 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.040) 0:00:27.166 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.037) 0:00:27.204 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.031) 0:00:27.236 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.031) 0:00:27.267 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.027) 0:00:27.294 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:41
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.029) 0:00:27.324 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.061) 0:00:27.385 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.046) 0:00:27.431 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:22:20 +0000 (0:00:00.538) 0:00:27.970 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var":
"item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.074) 0:00:28.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.030) 0:00:28.075 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.030) 0:00:28.105 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages 
installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.067) 0:00:28.173 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.026) 0:00:28.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.031) 0:00:28.230 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.039) 0:00:28.270 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.034) 0:00:28.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.032) 
0:00:28.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.035) 0:00:28.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.032) 0:00:28.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.032) 0:00:28.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.047) 0:00:28.485 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:22:21 +0000 (0:00:00.029) 0:00:28.514 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, 
"path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:22:22 +0000 (0:00:01.359) 0:00:29.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:22:22 +0000 (0:00:00.037) 0:00:29.911 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.089) 0:00:30.001 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.044) 0:00:30.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.039) 0:00:30.085 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.036) 0:00:30.121 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.030) 0:00:30.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:22:23 +0000 (0:00:00.674) 0:00:30.827 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:22:24 +0000 (0:00:00.402) 0:00:31.229 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : 
retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:22:24 +0000 (0:00:00.660) 0:00:31.889 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:22:25 +0000 (0:00:00.396) 0:00:32.285 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:22:25 +0000 (0:00:00.031) 0:00:32.316 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:53 Wednesday 01 June 2022 17:22:26 +0000 (0:00:00.868) 0:00:33.185 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:22:26 +0000 (0:00:00.057) 0:00:33.243 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:22:26 +0000 (0:00:00.039) 0:00:33.282 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:22:26 +0000 (0:00:00.028) 0:00:33.310 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "94742250-de6f-4267-abf5-a3017331adc4" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "PDIrR8-M9ii-1uP1-0H0F-X2xu-1JM2-HolOcx" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:22:26 +0000 (0:00:00.388) 0:00:33.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002819", "end": "2022-06-01 13:22:26.444495", "rc": 0, "start": "2022-06-01 13:22:26.441676" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.384) 0:00:34.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002823", "end": "2022-06-01 13:22:26.837533", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:22:26.834710" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.393) 0:00:34.478 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
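For orientation, the `show storage_pools` output and the blivet result above correspond to a role invocation of roughly the following shape. This is a hedged reconstruction from the logged variable values (pool `foo` on disk `sda`, one 4g XFS volume `test1` mounted at `/opt/test1`), not the literal contents of `tests_lvm_one_disk_one_volume.yml`, which may differ in layout.

```yaml
# Sketch reconstructed from the logged storage_pools value; the actual
# test playbook may structure this differently (e.g. via include_role).
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
```

Running the role a second time with the same variables, as the `Repeat the previous invocation to verify idempotence` task does, should report `"changed": false` with an empty `actions` list, which is exactly what the `manage the pools and volumes to match the specified state` task shows above.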
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.064) 0:00:34.542 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.034) 0:00:34.577 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.103) 0:00:34.681 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:22:27 +0000 (0:00:00.041) 0:00:34.722 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.403) 0:00:35.125 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.040) 0:00:35.166 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.036) 0:00:35.203 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.039) 0:00:35.242 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.038) 0:00:35.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.032) 0:00:35.314 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.048) 0:00:35.363 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.063) 0:00:35.426 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.031) 0:00:35.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.029) 0:00:35.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.029) 0:00:35.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.029) 0:00:35.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.029) 0:00:35.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:22:28 +0000 (0:00:00.032) 0:00:35.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.031) 0:00:35.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.030) 0:00:35.669 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.059) 0:00:35.728 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.068) 0:00:35.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.033) 0:00:35.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:22:28 +0000 (0:00:00.034) 0:00:35.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.031) 0:00:35.897 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:22:28 +0000 (0:00:00.065) 0:00:35.962 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.037) 0:00:36.000 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.037) 0:00:36.037 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.058) 0:00:36.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.037) 0:00:36.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.035) 0:00:36.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.028) 0:00:36.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.028) 0:00:36.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.255 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.031) 0:00:36.316 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.060) 0:00:36.377 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.064) 0:00:36.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.032) 0:00:36.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.031) 0:00:36.623 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.029) 0:00:36.653 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.033) 0:00:36.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.034) 0:00:36.721 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.061) 0:00:36.782 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:22:29 +0000 (0:00:00.102) 0:00:36.885 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.135) 0:00:37.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.038) 0:00:37.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "94742250-de6f-4267-abf5-a3017331adc4" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030454, "block_size": 4096, "block_total": 1046016, "block_used": 15562, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 4220739584, "size_total": 4284481536, "uuid": "94742250-de6f-4267-abf5-a3017331adc4" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.046) 0:00:37.105 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.040) 0:00:37.145 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:22:30 +0000 
(0:00:00.038) 0:00:37.183 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.037) 0:00:37.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.031) 0:00:37.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.032) 0:00:37.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.033) 0:00:37.318 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.035) 0:00:37.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.052) 0:00:37.406 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.036) 0:00:37.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.038) 0:00:37.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.032) 0:00:37.514 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.034) 0:00:37.548 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.038) 0:00:37.587 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:22:30 +0000 (0:00:00.039) 0:00:37.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104127.5161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104127.5161216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18189, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104127.5161216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.416) 0:00:38.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.036) 0:00:38.079 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.029) 0:00:38.176 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.036) 0:00:38.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.035) 0:00:38.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device 
if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.030) 0:00:38.312 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.040) 0:00:38.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.030) 0:00:38.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.035) 0:00:38.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.029) 0:00:38.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.039) 0:00:38.551 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.038) 0:00:38.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.035) 0:00:38.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.036) 0:00:38.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.695 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.763 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.036) 0:00:38.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.033) 0:00:38.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:22:31 +0000 (0:00:00.032) 0:00:38.963 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.386) 0:00:39.349 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.401) 0:00:39.750 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.037) 0:00:39.788 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.034) 0:00:39.823 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.036) 0:00:39.860 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.038) 0:00:39.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:22:32 +0000 (0:00:00.038) 0:00:39.937 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.039) 0:00:39.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.038) 0:00:40.015 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.037) 0:00:40.053 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.034) 0:00:40.087 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.041) 0:00:40.129 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.035589", "end": "2022-06-01 13:22:32.928706", "rc": 0, "start": "2022-06-01 13:22:32.893117" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.439) 0:00:40.568 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.039) 0:00:40.607 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.061) 0:00:40.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.034) 0:00:40.703 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.032) 0:00:40.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.030) 0:00:40.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.034) 0:00:40.802 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.030) 0:00:40.832 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.029) 0:00:40.862 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.027) 0:00:40.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:55 Wednesday 01 June 2022 17:22:33 +0000 (0:00:00.029) 0:00:40.919 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.071) 0:00:40.990 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.042) 0:00:41.033 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.524) 0:00:41.558 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an 
empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.077) 0:00:41.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.032) 0:00:41.667 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.033) 0:00:41.700 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.063) 0:00:41.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.026) 0:00:41.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.035) 0:00:41.827 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.047) 0:00:41.875 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.035) 0:00:41.911 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:22:34 +0000 (0:00:00.038) 0:00:41.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:22:35 +0000 (0:00:00.032) 0:00:41.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:22:35 +0000 (0:00:00.031) 0:00:42.013 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:22:35 +0000 (0:00:00.033) 0:00:42.046 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:22:35 +0000 (0:00:00.049) 0:00:42.095 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:22:35 +0000 (0:00:00.030) 0:00:42.125 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:22:37 +0000 (0:00:01.981) 0:00:44.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.032) 0:00:44.139 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.028) 0:00:44.168 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": 
"lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.038) 0:00:44.207 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.041) 0:00:44.248 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.090) 0:00:44.339 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { 
"ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:22:37 +0000 (0:00:00.403) 0:00:44.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:22:38 +0000 (0:00:00.705) 0:00:45.448 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:22:38 +0000 (0:00:00.032) 0:00:45.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:22:39 +0000 (0:00:00.642) 0:00:46.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:22:39 +0000 (0:00:00.381) 0:00:46.505 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:22:39 +0000 (0:00:00.030) 0:00:46.536 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:69 Wednesday 01 June 2022 17:22:40 +0000 (0:00:00.861) 0:00:47.397 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:22:40 +0000 (0:00:00.061) 0:00:47.458 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "absent", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:22:40 +0000 (0:00:00.039) 0:00:47.498 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:22:40 +0000 (0:00:00.033) 0:00:47.532 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", 
"type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:22:40 +0000 (0:00:00.395) 0:00:47.927 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002888", "end": "2022-06-01 13:22:40.683438", "rc": 0, "start": "2022-06-01 13:22:40.680550" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.394) 0:00:48.322 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002896", "end": "2022-06-01 13:22:41.075663", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:22:41.072767" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.392) 0:00:48.715 ******** [WARNING]: The loop variable 
'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.061) 0:00:48.776 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.039) 0:00:48.816 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.066) 0:00:48.883 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:22:41 +0000 (0:00:00.042) 0:00:48.925 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.079) 0:00:49.005 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:22:42 
+0000 (0:00:00.027) 0:00:49.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.035) 0:00:49.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.034) 0:00:49.102 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.033) 0:00:49.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.027) 0:00:49.163 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.030) 0:00:49.194 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.051) 0:00:49.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.026) 0:00:49.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.026) 0:00:49.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.027) 0:00:49.326 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.029) 0:00:49.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.031) 0:00:49.387 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.032) 0:00:49.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 
Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.031) 0:00:49.452 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.030) 0:00:49.482 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.068) 0:00:49.550 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.069) 0:00:49.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.033) 0:00:49.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.031) 0:00:49.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.032) 0:00:49.717 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.062) 0:00:49.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.037) 0:00:49.816 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.030) 0:00:49.847 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.034) 0:00:49.882 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:22:42 +0000 (0:00:00.035) 0:00:49.917 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.063) 0:00:49.981 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.068) 0:00:50.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.033) 0:00:50.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.028) 0:00:50.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.028) 0:00:50.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.033) 0:00:50.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.028) 0:00:50.267 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.330 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.056) 0:00:50.387 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.039) 0:00:50.426 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.140) 0:00:50.567 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.039) 0:00:50.606 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.042) 0:00:50.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.033) 0:00:50.682 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.037) 0:00:50.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.033) 0:00:50.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.030) 
0:00:50.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.031) 0:00:50.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.033) 0:00:50.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.048) 0:00:50.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:22:43 +0000 (0:00:00.030) 0:00:50.958 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.094) 0:00:51.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.083 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.029) 0:00:51.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.027) 0:00:51.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.024) 0:00:51.165 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.384) 0:00:51.549 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.039) 0:00:51.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.027) 0:00:51.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.035) 0:00:51.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.026) 0:00:51.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.032) 0:00:51.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.034) 0:00:51.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.027) 0:00:51.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.031) 0:00:51.863 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:22:44 +0000 (0:00:00.030) 0:00:51.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.034) 0:00:51.990 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.038) 0:00:52.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.036) 0:00:52.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.030) 0:00:52.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.031) 0:00:52.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.033) 0:00:52.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.037) 0:00:52.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.042) 0:00:52.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.033) 0:00:52.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.032) 0:00:52.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.031) 0:00:52.336 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.033) 0:00:52.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.034) 0:00:52.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.032) 0:00:52.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.033) 0:00:52.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.032) 0:00:52.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.031) 0:00:52.534 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.034) 0:00:52.568 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.035) 0:00:52.603 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.031) 0:00:52.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.034) 0:00:52.670 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.032) 0:00:52.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.032) 0:00:52.735 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.036) 0:00:52.771 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.049) 0:00:52.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.036) 0:00:52.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.033) 0:00:52.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.035) 0:00:52.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:22:45 +0000 (0:00:00.031) 0:00:52.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.032) 0:00:52.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.035) 0:00:53.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.033) 0:00:53.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.032) 0:00:53.092 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.032) 0:00:53.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.032) 0:00:53.157 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.029) 0:00:53.187 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP 
********************************************************************* /cache/rhel-x.qcow2 : ok=286 changed=4 unreachable=0 failed=0 skipped=230 rescued=0 ignored=0 Wednesday 01 June 2022 17:22:46 +0000 (0:00:00.017) 0:00:53.204 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.98s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.36s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.35s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.10s /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_scsi_generated.yml:3 ----- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : make sure required packages are installed --- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.86s /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml:2 -------------------- linux-system-roles.storage : Update facts ------------------------------- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.73s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.71s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.64s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 
4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:22:47 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [/cache/rhel-x.qcow2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1  changed=0  unreachable=0  failed=0  skipped=0  rescued=0  ignored=0

Wednesday 01 June 2022 17:22:48 +0000 (0:00:01.310)       0:00:01.333 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.31s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_percent_size.yml *******************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:2
Wednesday 01 June 2022 17:22:48 +0000 (0:00:00.022)       0:00:01.355 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:17
Wednesday 01 June 2022 17:22:49 +0000 (0:00:01.078)       0:00:02.434 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:22:49 +0000 (0:00:00.037)       0:00:02.471 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:22:49 +0000 (0:00:00.159)       0:00:02.630 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.535)       0:00:03.165 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.076)       0:00:03.242 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.022)       0:00:03.264 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.021)       0:00:03.285 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.193)       0:00:03.479 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:22:50 +0000 (0:00:00.018)       0:00:03.498 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:22:51 +0000 (0:00:01.117)       0:00:04.615 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:22:51 +0000 (0:00:00.047)       0:00:04.663 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:22:51 +0000 (0:00:00.048)       0:00:04.712 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:22:52 +0000 (0:00:00.701)       0:00:05.413 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:22:52 +0000 (0:00:00.082)       0:00:05.495 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:22:52 +0000 (0:00:00.021)       0:00:05.517 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:22:52 +0000 (0:00:00.021)       0:00:05.539 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:22:52 +0000 (0:00:00.024)       0:00:05.564 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:22:53 +0000 (0:00:00.873)       0:00:06.437 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"services": {"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"}, "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"}, "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"}, "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias"}, "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"}, "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"}, "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state":
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:22:55 +0000 (0:00:01.869) 0:00:08.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:22:55 +0000 
(0:00:00.043) 0:00:08.350 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:22:55 +0000 (0:00:00.028) 0:00:08.379 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:22:55 +0000 (0:00:00.525) 0:00:08.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:22:55 +0000 (0:00:00.029) 0:00:08.934 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:22:55 +0000 (0:00:00.026) 0:00:08.961 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:22:55 +0000 (0:00:00.034) 0:00:08.996 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.032) 0:00:09.028 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.031) 0:00:09.060 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.026) 0:00:09.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.026) 0:00:09.113 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.025) 0:00:09.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.029) 0:00:09.168 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.483) 0:00:09.651 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:22:56 +0000 (0:00:00.028) 0:00:09.680 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:20 Wednesday 01 June 2022 17:22:57 +0000 (0:00:00.831) 0:00:10.511 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:22:57 +0000 (0:00:00.044) 0:00:10.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.538) 0:00:11.094 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, 
"changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.036) 0:00:11.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.030) 0:00:11.161 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Try to create LVM with an invalid size specification.] ******************* task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:27 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.033) 0:00:11.194 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.050) 0:00:11.244 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.045) 0:00:11.289 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.518) 0:00:11.807 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": 
"Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.069) 0:00:11.877 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.030) 0:00:11.908 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.030) 0:00:11.939 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:22:58 +0000 (0:00:00.062) 0:00:12.001 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:22:59 +0000 (0:00:00.027) 0:00:12.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:22:59 +0000 (0:00:00.886) 0:00:12.915 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "2x%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:22:59 +0000 (0:00:00.037) 0:00:12.953 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:22:59 +0000 (0:00:00.033) 0:00:12.986 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:23:00 +0000 (0:00:01.031) 0:00:14.018 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR 
support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:23:01 +0000 (0:00:00.058) 0:00:14.077 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:23:01 +0000 (0:00:00.028) 0:00:14.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:23:01 +0000 (0:00:00.030) 0:00:14.137 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:23:01 +0000 (0:00:00.029) 0:00:14.166 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:23:02 +0000 (0:00:00.859) 0:00:15.025 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false 
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  17:23:03 +0000 (0:00:01.750)       0:00:16.776 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  17:23:03 +0000 (0:00:00.054)       0:00:16.831 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  17:23:03 +0000 (0:00:00.029)       0:00:16.860 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

invalid percentage '2x%' size specified for volume 'test1'

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022  17:23:04 +0000 (0:00:01.076)       0:00:17.937 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'2x%', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': 
None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid percentage '2x%' size specified for volume 'test1'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:23:04 +0000 (0:00:00.045) 0:00:17.983 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:44 Wednesday 01 June 2022 17:23:04 +0000 (0:00:00.027) 0:00:18.010 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check for the expected error message] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:50 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.033) 0:00:18.044 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create two LVM logical volumes under volume group 'foo' using percentage sizes] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:63 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.032) 0:00:18.076 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.052) 0:00:18.128 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.042) 0:00:18.170 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.545) 0:00:18.716 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.070) 0:00:18.787 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.080) 0:00:18.868 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.032) 0:00:18.900 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.064) 0:00:18.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:23:05 +0000 (0:00:00.025) 0:00:18.990 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:23:06 +0000 (0:00:00.896) 0:00:19.887 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "fs_type": "ext4", "mount_point": "/opt/test2", "name": "test2", "size": "40%" } ] } ] } 
TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:23:06 +0000 (0:00:00.037) 0:00:19.924 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:23:06 +0000 (0:00:00.031) 0:00:19.956 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "e2fsprogs", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:23:07 +0000 (0:00:01.050) 0:00:21.007 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:23:08 +0000 (0:00:00.058) 0:00:21.065 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:23:08 +0000 (0:00:00.027) 0:00:21.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:23:08 +0000 (0:00:00.028) 0:00:21.121 ******** TASK 
[linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:23:08 +0000 (0:00:00.026) 0:00:21.148 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:23:09 +0000 (0:00:00.881) 0:00:22.029 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": 
{ "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" 
}, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": 
"nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:23:10 +0000 (0:00:01.701) 0:00:23.731 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:23:10 +0000 (0:00:00.045) 0:00:23.776 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 
June 2022 17:23:10 +0000 (0:00:00.036) 0:00:23.812 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:23:12 +0000 (0:00:02.092) 0:00:25.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 
17:23:12 +0000 (0:00:00.034) 0:00:25.940 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:23:12 +0000 (0:00:00.034) 0:00:25.974 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": 
"/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:23:13 +0000 (0:00:00.048) 0:00:26.022 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": 
"uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:23:13 +0000 (0:00:00.042) 0:00:26.064 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:23:13 +0000 (0:00:00.035) 0:00:26.100 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:23:13 +0000 (0:00:00.030) 0:00:26.131 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:23:14 +0000 (0:00:01.006) 0:00:27.137 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": 
"/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:23:15 +0000 (0:00:01.012) 0:00:28.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:23:15 +0000 (0:00:00.698) 0:00:28.848 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:23:16 +0000 (0:00:00.395) 0:00:29.243 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:23:16 +0000 (0:00:00.030) 0:00:29.274 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:79 Wednesday 01 June 2022 17:23:17 +0000 (0:00:00.925) 0:00:30.200 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:23:17 +0000 (0:00:00.052) 0:00:30.252 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:23:17 +0000 (0:00:00.041) 0:00:30.293 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:23:17 +0000 (0:00:00.031) 0:00:30.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", 
"name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:23:17 +0000 (0:00:00.550) 0:00:30.875 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002872", "end": "2022-06-01 13:23:17.721943", "rc": 0, "start": "2022-06-01 13:23:17.719071" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.544) 0:00:31.420 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002968", "end": "2022-06-01 13:23:18.112447", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:23:18.109479" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.383) 0:00:31.803 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.070) 0:00:31.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 
Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.032) 0:00:31.906 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.066) 0:00:31.973 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:23:18 +0000 (0:00:00.041) 0:00:32.014 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.569) 0:00:32.584 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.043) 0:00:32.627 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.041) 0:00:32.669 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.039) 0:00:32.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.038) 0:00:32.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.102) 0:00:32.850 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.047) 0:00:32.898 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.060) 0:00:32.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:23:19 +0000 (0:00:00.033) 0:00:32.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.033) 0:00:33.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.033) 0:00:33.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.033) 0:00:33.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.188 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.033) 0:00:33.222 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.061) 0:00:33.283 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.078) 0:00:33.362 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.032) 0:00:33.426 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.032) 0:00:33.459 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.033) 0:00:33.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.032) 0:00:33.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.035) 0:00:33.560 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.065) 0:00:33.626 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.039) 0:00:33.666 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.037) 0:00:33.703 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.062) 0:00:33.766 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.036) 0:00:33.803 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.034) 0:00:33.838 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.036) 0:00:33.874 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.906 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.030) 0:00:33.936 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.031) 0:00:33.968 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:23:20 +0000 (0:00:00.030) 0:00:33.998 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.068) 0:00:34.066 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.084) 0:00:34.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.034) 0:00:34.185 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.035) 0:00:34.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.034) 0:00:34.255 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.035) 0:00:34.291 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.034) 0:00:34.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.033) 0:00:34.359 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.034) 0:00:34.393 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.034) 0:00:34.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.030) 0:00:34.459 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.033) 0:00:34.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.033) 0:00:34.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.032) 0:00:34.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.032) 0:00:34.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.039) 0:00:34.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.036) 0:00:34.666 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.033) 0:00:34.700 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.033) 0:00:34.733 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.118) 0:00:34.852 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.035) 0:00:34.888 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:23:21 +0000 (0:00:00.126) 0:00:35.014 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.038) 0:00:35.052 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.047) 0:00:35.099 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.044) 0:00:35.144 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.036) 0:00:35.181 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.040) 0:00:35.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.031) 0:00:35.253 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.031) 0:00:35.284 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.032) 0:00:35.316 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.030) 0:00:35.347 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.046) 0:00:35.394 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.034) 0:00:35.429 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.035) 0:00:35.464 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.031) 0:00:35.496 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.036) 0:00:35.533 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.042) 0:00:35.576 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:23:22 +0000 (0:00:00.046) 0:00:35.622 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104192.1391215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104192.1391215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18468, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104192.1391215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.438) 0:00:36.060 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.040) 0:00:36.101 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.039) 0:00:36.141 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.038) 0:00:36.179 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.033) 0:00:36.213 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.038) 0:00:36.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.033) 0:00:36.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.317 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.031) 0:00:36.349 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.042) 0:00:36.391 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.033) 0:00:36.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.457 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.555 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.042) 0:00:36.598 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.038) 0:00:36.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.031) 0:00:36.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.030) 0:00:36.699 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.030) 0:00:36.730 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.033) 0:00:36.763 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.035) 0:00:36.799 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.034) 0:00:36.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.031) 0:00:36.865 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.897 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.032) 0:00:36.930 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.030) 0:00:36.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:23:23 +0000 (0:00:00.033) 0:00:36.994 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.553) 0:00:37.548 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.033) 0:00:37.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.035) 0:00:37.617 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.043) 0:00:37.660 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.047) 0:00:37.708 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:23:24 +0000 (0:00:00.051) 0:00:37.759 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.408) 0:00:38.167 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.040) 0:00:38.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.039) 0:00:38.247 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.037) 0:00:38.284 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.035) 0:00:38.320 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.045) 0:00:38.365 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.050664", "end": "2022-06-01 13:23:25.124346", "rc": 0, "start": "2022-06-01 13:23:25.073682" }

STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.452) 0:00:38.818 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.050) 0:00:38.868 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.042) 0:00:38.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.034) 0:00:38.945 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.034) 0:00:38.980 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:23:25 +0000 (0:00:00.034) 0:00:39.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.036) 0:00:39.050 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.033) 0:00:39.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.039) 0:00:39.123 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.136) 0:00:39.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.036) 0:00:39.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.045) 0:00:39.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.038) 0:00:39.381 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.036) 0:00:39.418 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.042) 0:00:39.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.032) 0:00:39.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.031) 0:00:39.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.033) 0:00:39.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.034) 0:00:39.594 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.051) 0:00:39.645 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.037) 0:00:39.683 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.040) 0:00:39.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.032) 0:00:39.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.038) 0:00:39.793 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.042) 0:00:39.836 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:23:26 +0000 (0:00:00.041) 0:00:39.877 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104191.8951216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104191.8951216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104191.8951216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.416) 0:00:40.294 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.046) 0:00:40.340 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.041) 0:00:40.382 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.041) 0:00:40.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.081) 0:00:40.505 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.038) 0:00:40.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.032) 0:00:40.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.032) 0:00:40.609 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.032) 0:00:40.641 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.041) 0:00:40.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.035) 0:00:40.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.031) 0:00:40.749 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.033) 0:00:40.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.033) 0:00:40.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.032) 
0:00:40.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.040) 0:00:40.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.051) 0:00:40.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.035) 0:00:40.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:23:27 +0000 (0:00:00.031) 0:00:41.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.029) 0:00:41.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.030) 0:00:41.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.033) 0:00:41.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.036) 0:00:41.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.031) 0:00:41.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.033) 0:00:41.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.031) 0:00:41.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.032) 0:00:41.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.031) 0:00:41.299 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.422) 0:00:41.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.034) 0:00:41.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.033) 0:00:41.789 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.036) 0:00:41.826 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": 
"lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.048) 0:00:41.875 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", 
"label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:23:28 +0000 (0:00:00.046) 0:00:41.921 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.388) 0:00:42.310 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.037) 0:00:42.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.038) 0:00:42.387 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.037) 0:00:42.424 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.034) 0:00:42.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.043) 0:00:42.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.035687", "end": "2022-06-01 13:23:29.242907", "rc": 0, "start": "2022-06-01 13:23:29.207220" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.435) 0:00:42.937 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:23:29 +0000 (0:00:00.041) 0:00:42.979 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.042) 0:00:43.022 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.035) 0:00:43.058 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.036) 0:00:43.094 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.035) 0:00:43.129 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.035) 0:00:43.164 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.035) 0:00:43.200 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.030) 0:00:43.230 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.029) 0:00:43.260 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:81
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.033) 0:00:43.293 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.065) 0:00:43.359 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.099) 0:00:43.459 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:23:30 +0000 (0:00:00.547) 0:00:44.006 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:23:31 +0000 (0:00:00.076) 0:00:44.083 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:23:31 +0000 (0:00:00.033) 0:00:44.116 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:23:31 +0000 (0:00:00.038) 0:00:44.155 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:23:31 +0000 (0:00:00.067) 0:00:44.222 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:23:31 +0000 (0:00:00.029) 0:00:44.252 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:23:32 +0000 (0:00:00.924) 0:00:45.177 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "60%"
                },
                {
                    "mount_point": "/opt/test2",
                    "name": "test2",
                    "size": "40%"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:23:32 +0000 (0:00:00.042) 0:00:45.219 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:23:32 +0000 (0:00:00.037) 0:00:45.256 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:23:33 +0000 (0:00:01.400) 0:00:46.657 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:23:33 +0000 (0:00:00.056) 0:00:46.714 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01
June 2022 17:23:33 +0000 (0:00:00.030) 0:00:46.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:23:33 +0000 (0:00:00.031) 0:00:46.776 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:23:33 +0000 (0:00:00.029) 0:00:46.806 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:23:34 +0000 (0:00:00.908) 0:00:47.715 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": 
"blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": 
"rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": 
"sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": 
"systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:23:36 +0000 (0:00:01.743) 0:00:49.458 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK 
[linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:23:36 +0000 (0:00:00.050) 0:00:49.509 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:23:36 +0000 (0:00:00.031) 0:00:49.541 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:23:38 +0000 (0:00:01.539) 0:00:51.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:23:38 +0000 
(0:00:00.031) 0:00:51.111 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.029) 0:00:51.141 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 
0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.044) 0:00:51.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": 
"/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.043) 0:00:51.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.038) 0:00:51.268 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.032) 0:00:51.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:23:38 +0000 (0:00:00.672) 0:00:51.973 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": 
"/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:23:39 +0000 (0:00:00.777) 0:00:52.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:23:40 +0000 (0:00:00.668) 0:00:53.419 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:23:40 +0000 (0:00:00.413) 0:00:53.832 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:23:40 +0000 (0:00:00.029) 0:00:53.862 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:95 Wednesday 01 June 2022 17:23:41 +0000 (0:00:00.883) 0:00:54.746 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:23:41 +0000 (0:00:00.056) 0:00:54.802 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:23:41 +0000 (0:00:00.046) 0:00:54.848 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:23:41 +0000 (0:00:00.034) 0:00:54.883 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", 
"name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:23:42 +0000 (0:00:00.405) 0:00:55.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003643", "end": "2022-06-01 13:23:42.016709", "rc": 0, "start": "2022-06-01 13:23:42.013066" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:23:42 +0000 (0:00:00.422) 0:00:55.710 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003597", "end": "2022-06-01 13:23:42.431050", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:23:42.427453" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.417) 0:00:56.128 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.077) 0:00:56.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.035) 0:00:56.241 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.112) 0:00:56.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.069) 0:00:56.423 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.490) 0:00:56.914 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.044) 0:00:56.958 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:23:43 +0000 (0:00:00.040) 0:00:56.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.037) 0:00:57.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.039) 0:00:57.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.034) 0:00:57.111 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.047) 0:00:57.159 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.063) 0:00:57.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.032) 0:00:57.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.031) 0:00:57.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.037) 0:00:57.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.033) 0:00:57.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.032) 0:00:57.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:23:44 +0000 (0:00:00.036) 0:00:57.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.034) 0:00:57.462 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.033) 0:00:57.495 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.060) 0:00:57.556 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.077) 0:00:57.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.033) 0:00:57.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.039) 0:00:57.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.036) 0:00:57.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.034) 0:00:57.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.038) 0:00:57.815 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.032) 0:00:57.847 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.064) 0:00:57.911 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.039) 0:00:57.950 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:23:44 +0000 (0:00:00.040) 0:00:57.990 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.061) 0:00:58.052 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.039) 0:00:58.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.038) 0:00:58.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.032) 0:00:58.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.035) 0:00:58.266 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.301 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.069) 0:00:58.371 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.084) 0:00:58.455 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.489 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.032) 0:00:58.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.033) 0:00:58.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.033) 0:00:58.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.033) 0:00:58.692 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.036) 0:00:58.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:58.763 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.031) 0:00:58.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.031) 0:00:58.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.032) 0:00:58.860 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.032) 0:00:58.893 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.086) 0:00:58.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:23:45 +0000 (0:00:00.034) 0:00:59.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.033) 0:00:59.046 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.034) 0:00:59.081 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
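The warning above is Ansible's standard notice that an outer loop's variable (`storage_test_volume`) is being reused by an inner included loop. A minimal sketch of the remedy the warning itself suggests, assuming the including task iterates with `loop:` over the pool's volumes (the task name, file path, and `_storage_test_pool` variable here are illustrative, not taken from the role's actual source):

```yaml
# Hypothetical sketch only: rename the inner loop's variable via loop_control
# so it no longer collides with the outer loop's 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner  # any name distinct from the outer loop_var
```

With a distinct `loop_var`, the inner tasks reference `storage_test_volume_inner` instead, and the collision warning disappears from the run.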
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.078) 0:00:59.160 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.040) 0:00:59.201 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.132) 0:00:59.333 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
17:23:46 +0000 (0:00:00.048) 0:00:59.382 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.045) 0:00:59.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.040) 0:00:59.467 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.039) 0:00:59.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.038) 0:00:59.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.032) 0:00:59.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.032) 0:00:59.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.032) 0:00:59.643 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.034) 0:00:59.678 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.052) 0:00:59.730 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.037) 0:00:59.767 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.038) 0:00:59.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.030) 0:00:59.836 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.031) 0:00:59.868 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.038) 0:00:59.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:23:46 +0000 (0:00:00.043) 0:00:59.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104192.1391215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104192.1391215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18468, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104192.1391215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.402) 0:01:00.352 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.037) 0:01:00.390 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.040) 0:01:00.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.034) 0:01:00.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.495 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.039) 0:01:00.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.029) 0:01:00.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.625 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.037) 0:01:00.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.032) 0:01:00.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.030) 0:01:00.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.034) 
0:01:00.821 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.038) 0:01:00.860 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.038) 0:01:00.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.038) 0:01:00.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.035) 0:01:00.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:23:47 +0000 (0:00:00.032) 0:01:01.005 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.031) 0:01:01.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.030) 0:01:01.067 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.030) 0:01:01.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.032) 0:01:01.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.030) 0:01:01.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.030) 0:01:01.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.033) 0:01:01.226 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.032) 0:01:01.258 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.380) 0:01:01.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.032) 0:01:01.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.034) 0:01:01.705 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.035) 0:01:01.740 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": 
"present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:23:48 +0000 (0:00:00.051) 0:01:01.792 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, 
"/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  17:23:48 +0000 (0:00:00.046)       0:01:01.838 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.396)       0:01:02.235 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.040)       0:01:02.275 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.049)       0:01:02.325 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.035)       0:01:02.360 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.037)       0:01:02.397 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.043)       0:01:02.440 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038690", "end": "2022-06-01 13:23:49.177658", "rc": 0, "start": "2022-06-01 13:23:49.138968" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.425)       0:01:02.866 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.037)       0:01:02.904 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.040)       0:01:02.945 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.035)       0:01:02.980 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  17:23:49 +0000 (0:00:00.037)       0:01:03.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.034)       0:01:03.053 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.034)       0:01:03.087 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.033)       0:01:03.120 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.036)       0:01:03.157 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.127)       0:01:03.284 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.041)       0:01:03.326 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid":
"9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.053)       0:01:03.380 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.038)       0:01:03.418 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.035)       0:01:03.454 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.040)       0:01:03.494 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.032)       0:01:03.526 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.034)       0:01:03.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.033)       0:01:03.594 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.033)       0:01:03.628 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.047)       0:01:03.675 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.036)       0:01:03.712 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.040)       0:01:03.753 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.032)       0:01:03.785 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.032)       0:01:03.818 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.038)       0:01:03.856 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  17:23:50 +0000 (0:00:00.038)       0:01:03.894 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104217.3381214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104217.3381214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104217.3381214, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.400)       0:01:04.295 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.038)       0:01:04.334 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.038)       0:01:04.372 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.122)       0:01:04.494 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.035)       0:01:04.530 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.039)       0:01:04.569 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.602 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.667 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.041)       0:01:04.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.774 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.033)       0:01:04.808 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.841 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.032)       0:01:04.873 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.042)       0:01:04.916 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.037)       0:01:04.953 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.033)       0:01:04.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  17:23:51 +0000 (0:00:00.030)       0:01:05.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.032)       0:01:05.050 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.049)       0:01:05.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.036)       0:01:05.136 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.047)       0:01:05.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.044)       0:01:05.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.030)       0:01:05.258 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.030)       0:01:05.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.033)       0:01:05.322 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.032)       0:01:05.355 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.397)       0:01:05.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.035)       0:01:05.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.033)       0:01:05.821 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.036)       0:01:05.858 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.048)       0:01:05.907 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  17:23:52 +0000 (0:00:00.055)       0:01:05.962 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.406)       0:01:06.368 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.041)       0:01:06.409 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.040)       0:01:06.449 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.037)       0:01:06.486 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.036)       0:01:06.523 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.041)       0:01:06.564 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.034493", "end": "2022-06-01 13:23:53.279914", "rc": 0, "start": "2022-06-01 13:23:53.245421" }
STDOUT:
LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  17:23:53 +0000 (0:00:00.410)       0:01:06.974 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.048)       0:01:07.023 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.042)       0:01:07.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.035)       0:01:07.102 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.036)       0:01:07.138 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.036)       0:01:07.175 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.036)       0:01:07.211 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.034)       0:01:07.246 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.031)       0:01:07.278 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.032)       0:01:07.310 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Shrink test2 volume via percentage-based size spec] **********************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:97
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.034)       0:01:07.344 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.123)       0:01:07.468 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:23:54 +0000 (0:00:00.047)       0:01:07.515 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.561)       0:01:08.076 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.072)       0:01:08.149 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.033)       0:01:08.182 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.035)       0:01:08.217 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.068)       0:01:08.286 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:23:55 +0000 (0:00:00.030)       0:01:08.316 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:23:56 +0000 (0:00:01.005)       0:01:09.322 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:23:56 +0000 (0:00:00.040)       0:01:09.362 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:23:56 +0000 (0:00:00.035)       0:01:09.398 ********
ok: [/cache/rhel-x.qcow2] => {
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:23:57 +0000 (0:00:01.355) 0:01:10.753 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:23:57 +0000 (0:00:00.060) 0:01:10.814 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:23:57 +0000 (0:00:00.030) 0:01:10.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:23:57 +0000 (0:00:00.033) 0:01:10.878 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:23:57 +0000 (0:00:00.033) 0:01:10.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:23:58 +0000 (0:00:00.918) 0:01:11.830 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": 
{ "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": 
{ "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:24:00 +0000 (0:00:01.767) 0:01:13.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:24:00 +0000 (0:00:00.065) 0:01:13.663 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:24:00 +0000 (0:00:00.038) 0:01:13.702 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": 
"defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:24:02 +0000 (0:00:01.979) 0:01:15.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.033) 0:01:15.716 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.031) 0:01:15.748 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", 
"passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.047) 0:01:15.796 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.045) 0:01:15.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.036) 0:01:15.878 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:24:02 +0000 (0:00:00.035) 0:01:15.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:24:03 +0000 (0:00:00.696) 0:01:16.611 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:24:04 +0000 (0:00:00.803) 0:01:17.414 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:24:05 +0000 (0:00:00.692) 0:01:18.106 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 
1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:24:05 +0000 (0:00:00.415) 0:01:18.521 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:24:05 +0000 (0:00:00.033) 0:01:18.554 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:112 Wednesday 01 June 2022 17:24:06 +0000 (0:00:00.896) 0:01:19.451 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:24:06 +0000 (0:00:00.062) 0:01:19.514 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:24:06 +0000 (0:00:00.048) 0:01:19.562 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:24:06 +0000 (0:00:00.033) 0:01:19.596 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": 
"partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:24:06 +0000 (0:00:00.400) 0:01:19.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002766", "end": "2022-06-01 13:24:06.685888", "rc": 0, "start": "2022-06-01 13:24:06.683122" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
/dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.378) 0:01:20.375 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003005", "end": "2022-06-01 13:24:07.074160", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:24:07.071155" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.401) 0:01:20.776 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.071) 0:01:20.848 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.033) 0:01:20.881 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.082) 0:01:20.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:24:07 +0000 (0:00:00.042) 0:01:21.006 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.413) 0:01:21.420 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.047) 0:01:21.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.046) 0:01:21.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.036) 0:01:21.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.036) 0:01:21.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.032) 0:01:21.622 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.045) 
0:01:21.667 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.060) 0:01:21.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.031) 0:01:21.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.031) 0:01:21.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.032) 0:01:21.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.032) 0:01:21.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.035) 0:01:21.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.035) 0:01:21.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.034) 0:01:21.961 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:24:08 +0000 (0:00:00.034) 0:01:21.996 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.062) 0:01:22.058 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.076) 0:01:22.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.032) 0:01:22.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.033) 0:01:22.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.035) 0:01:22.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.034) 0:01:22.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.034) 0:01:22.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.032) 0:01:22.338 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.065) 0:01:22.404 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.043) 0:01:22.447 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.036) 0:01:22.484 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.057) 0:01:22.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.038) 0:01:22.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.039) 0:01:22.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.030) 
0:01:22.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.029) 0:01:22.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.030) 0:01:22.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.033) 0:01:22.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.034) 0:01:22.778 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.125) 0:01:22.904 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 17:24:09 +0000 (0:00:00.085) 0:01:22.989 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.033) 0:01:23.023 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.035) 0:01:23.059 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.034) 0:01:23.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.034) 0:01:23.128 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.032) 0:01:23.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.036) 0:01:23.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.037) 0:01:23.234 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.042) 0:01:23.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.035) 0:01:23.313 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.033) 0:01:23.346 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.031) 0:01:23.378 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.033) 0:01:23.412 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.033) 0:01:23.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.031) 0:01:23.477 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.032) 0:01:23.509 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.034) 0:01:23.544 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.034) 0:01:23.579 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.075) 0:01:23.655 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.038) 0:01:23.693 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.122) 0:01:23.815 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.039) 0:01:23.855 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.046) 0:01:23.902 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.040) 0:01:23.942 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.036) 0:01:23.978 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:24:10 +0000 (0:00:00.041) 0:01:24.019 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.032) 0:01:24.051 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.030) 0:01:24.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.032) 0:01:24.115 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.033) 0:01:24.149 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.051) 0:01:24.201 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.038) 0:01:24.239 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.039) 0:01:24.278 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.032) 0:01:24.311 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.036) 0:01:24.348 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.041) 0:01:24.389 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.038) 0:01:24.428 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104192.1391215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104192.1391215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18468, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104192.1391215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.401) 0:01:24.829 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.041) 0:01:24.870 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.039) 0:01:24.909 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.037) 0:01:24.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:24:11 +0000 (0:00:00.035) 0:01:24.982 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.038) 0:01:25.021 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.033) 0:01:25.054 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.031) 0:01:25.086 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.032) 0:01:25.119 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.038) 0:01:25.157 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.078) 0:01:25.235 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.032) 0:01:25.268 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.030) 0:01:25.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.030) 0:01:25.330 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.030) 0:01:25.360 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.037) 0:01:25.398 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.037) 0:01:25.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.032) 0:01:25.468 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.033) 0:01:25.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.033) 0:01:25.535 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.038) 0:01:25.574 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.037) 0:01:25.611 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.041) 0:01:25.653 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.040) 0:01:25.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.043) 0:01:25.737 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.035) 0:01:25.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.036) 0:01:25.809 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:24:12 +0000 (0:00:00.037) 0:01:25.846 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.418) 0:01:26.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.034) 0:01:26.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.033) 0:01:26.334 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.038) 0:01:26.373 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.048) 0:01:26.421 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.045) 0:01:26.467 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.393) 0:01:26.860 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.042) 0:01:26.902 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.048) 0:01:26.951 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:24:13 +0000 (0:00:00.040) 0:01:26.991 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.038) 0:01:27.029 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.042) 0:01:27.072 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.042558", "end": "2022-06-01 13:24:13.836287", "rc": 0, "start": "2022-06-01 13:24:13.793729" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.469) 0:01:27.542 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.043) 0:01:27.585 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.043) 0:01:27.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.036) 0:01:27.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.036) 0:01:27.701 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.036) 0:01:27.738 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.036) 0:01:27.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.038) 0:01:27.813 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.039) 0:01:27.852 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.128) 0:01:27.981 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:24:14 +0000 (0:00:00.037) 0:01:28.018 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.047) 0:01:28.066 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.040) 0:01:28.107 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.092) 0:01:28.199 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.041) 0:01:28.241 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.035) 0:01:28.276 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.039) 0:01:28.316 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.039) 0:01:28.355 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.038) 0:01:28.394 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.056) 0:01:28.450 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.037) 0:01:28.488 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.039) 0:01:28.527 ********
skipping:
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.032) 0:01:28.560 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.034) 0:01:28.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.044) 0:01:28.639 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:24:15 +0000 (0:00:00.043) 0:01:28.682 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104241.8721216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104241.8721216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654104241.8721216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.405) 0:01:29.088 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.040) 0:01:29.129 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.040) 0:01:29.169 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.036) 0:01:29.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.241 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.041) 0:01:29.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.033) 0:01:29.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.033) 0:01:29.384 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.040) 0:01:29.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.459 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.037) 
0:01:29.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.035) 0:01:29.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.035) 0:01:29.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.035) 0:01:29.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.043) 0:01:29.646 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.039) 0:01:29.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 
Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.039) 0:01:29.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.033) 0:01:29.794 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.035) 0:01:29.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.033) 0:01:29.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.036) 0:01:29.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:29.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:24:16 +0000 (0:00:00.034) 0:01:30.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.035) 0:01:30.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.035) 0:01:30.074 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.386) 0:01:30.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 
Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.034) 0:01:30.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.034) 0:01:30.530 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.038) 0:01:30.568 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 
"60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.045) 0:01:30.614 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "d5ca902b-6920-427a-a3be-2a655f830bb8" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:24:17 +0000 (0:00:00.045) 0:01:30.659 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.407) 0:01:31.067 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.041) 0:01:31.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2684354560.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.039) 0:01:31.147 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.039) 0:01:31.187 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.037) 0:01:31.224 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.040) 0:01:31.264 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.034120", "end": "2022-06-01 13:24:18.002651", "rc": 0, "start": "2022-06-01 13:24:17.968531" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.434) 0:01:31.699 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.042) 0:01:31.742 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.043) 0:01:31.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.036) 0:01:31.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.085) 0:01:31.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:24:18 +0000 (0:00:00.036) 0:01:31.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 
17:24:18 +0000 (0:00:00.040) 0:01:31.985 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.039) 0:01:32.024 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.037) 0:01:32.062 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.034) 0:01:32.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Get the size of test2 volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:114 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.042) 0:01:32.139 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.005240", "end": "2022-06-01 13:24:18.852104", "rc": 0, "start": "2022-06-01 13:24:18.846864" } STDOUT: 2.5G TASK [Remove the test1 volume without changing its size] *********************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:119 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.415) 0:01:32.554 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.085) 
0:01:32.640 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:24:19 +0000 (0:00:00.051) 0:01:32.691 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.541) 0:01:33.233 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.079) 0:01:33.312 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an 
empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.034) 0:01:33.347 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.036) 0:01:33.383 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.067) 0:01:33.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:24:20 +0000 (0:00:00.029) 0:01:33.480 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:24:21 +0000 (0:00:00.945) 0:01:34.426 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%", "state": "absent" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:24:21 +0000 (0:00:00.043) 0:01:34.470 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:24:21 +0000 (0:00:00.036) 0:01:34.506 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:24:22 +0000 (0:00:01.370) 0:01:35.877 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:24:22 +0000 (0:00:00.067) 0:01:35.945 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:24:22 +0000 (0:00:00.032) 0:01:35.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:24:22 +0000 (0:00:00.034) 0:01:36.012 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:24:23 +0000 (0:00:00.031) 0:01:36.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:24:23 +0000 (0:00:00.849) 0:01:36.893 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:24:25 +0000 (0:00:01.810) 0:01:38.703 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:24:25 +0000 (0:00:00.050) 0:01:38.754 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:24:25 +0000 (0:00:00.030) 0:01:38.784 ******** changed: [/cache/rhel-x.qcow2] => { 
"actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:24:27 +0000 (0:00:01.940) 0:01:40.725 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:24:27 +0000 (0:00:00.031) 0:01:40.756 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:24:27 +0000 (0:00:00.029) 0:01:40.785 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": 
null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:24:27 +0000 (0:00:00.042) 0:01:40.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:24:27 +0000 (0:00:00.043) 0:01:40.871 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:24:27 +0000 (0:00:00.041) 0:01:40.912 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', 
u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:24:28 +0000 (0:00:00.407) 0:01:41.320 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:24:28 +0000 (0:00:00.691) 0:01:42.011 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:24:29 +0000 (0:00:00.404) 0:01:42.415 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 
Wednesday 01 June 2022 17:24:30 +0000 (0:00:00.676) 0:01:43.092 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:24:30 +0000 (0:00:00.390) 0:01:43.482 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:24:30 +0000 (0:00:00.031) 0:01:43.514 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:136 Wednesday 01 June 2022 17:24:31 +0000 (0:00:00.895) 0:01:44.409 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:24:31 +0000 (0:00:00.108) 0:01:44.518 ******** 
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:24:31 +0000 (0:00:00.045) 0:01:44.563 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:24:31 +0000 (0:00:00.032) 0:01:44.596 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", 
"label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:24:31 +0000 (0:00:00.421) 0:01:45.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003036", "end": "2022-06-01 13:24:31.729895", "rc": 0, "start": "2022-06-01 13:24:31.726859" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:24:32 +0000 (0:00:00.404) 0:01:45.422 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003016", "end": "2022-06-01 13:24:32.124058", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:24:32.121042" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:24:32 +0000 (0:00:00.394) 0:01:45.816 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:24:32 +0000 (0:00:00.075) 0:01:45.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:24:32 +0000 (0:00:00.036) 0:01:45.929 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:24:32 +0000 (0:00:00.074) 0:01:46.004 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.044) 0:01:46.048 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.403) 0:01:46.451 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.045) 0:01:46.497 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.039) 0:01:46.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.039) 0:01:46.576 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.039) 0:01:46.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.042) 0:01:46.658 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.059) 0:01:46.717 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.059) 0:01:46.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.033) 0:01:46.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.033) 0:01:46.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.035) 0:01:46.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.034) 0:01:46.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:24:33 +0000 (0:00:00.034) 0:01:46.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:24:33 +0000 (0:00:00.035) 0:01:46.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.034) 0:01:47.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.035) 0:01:47.056 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.063) 0:01:47.120 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.077) 0:01:47.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.034) 0:01:47.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.032) 0:01:47.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.031) 0:01:47.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.032) 0:01:47.329 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.032) 0:01:47.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.032) 0:01:47.394 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.118) 0:01:47.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.038) 0:01:47.551 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.038) 0:01:47.589 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.062) 0:01:47.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.039) 0:01:47.691 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.039) 0:01:47.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.031) 0:01:47.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.033) 0:01:47.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.031) 0:01:47.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.037) 0:01:47.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.034) 0:01:47.898 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:24:34 +0000 (0:00:00.065) 0:01:47.964 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.087) 0:01:48.052 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.034) 0:01:48.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.032) 0:01:48.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.032) 0:01:48.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.035) 0:01:48.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.034) 0:01:48.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.034) 0:01:48.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.035) 0:01:48.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.032) 0:01:48.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.036) 0:01:48.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.629 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.073) 0:01:48.702 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.039) 0:01:48.742 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.130) 0:01:48.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
17:24:35 +0000 (0:00:00.039) 0:01:48.913 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.043) 0:01:48.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:24:35 +0000 (0:00:00.033) 0:01:48.990 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.039) 0:01:49.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.030) 0:01:49.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.049) 0:01:49.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.027) 0:01:49.271 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] 
**************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.043) 0:01:49.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.348 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.034) 0:01:49.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.032) 0:01:49.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.028) 0:01:49.444 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 
17:24:36 +0000 (0:00:00.407) 0:01:49.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.040) 0:01:49.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.028) 0:01:49.921 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.037) 0:01:49.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:24:36 +0000 (0:00:00.033) 0:01:49.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.028) 0:01:50.020 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.053 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.031) 0:01:50.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.182 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.031) 0:01:50.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.034) 0:01:50.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.031) 0:01:50.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.312 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.044) 0:01:50.357 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.040) 0:01:50.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.464 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.497 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.530 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.036) 0:01:50.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.039) 0:01:50.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:24:37 +0000 
(0:00:00.033) 0:01:50.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.034) 0:01:50.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.032) 0:01:50.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.030) 0:01:50.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.029) 0:01:50.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.033) 0:01:50.867 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_expected_size": "2684354560.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.036) 0:01:50.904 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.037) 0:01:50.941 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.038) 0:01:50.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:24:37 +0000 (0:00:00.031) 0:01:51.012 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.033) 0:01:51.045 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.034) 0:01:51.080 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 
17:24:38 +0000 (0:00:00.038) 0:01:51.119 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.037) 0:01:51.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.033) 0:01:51.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.030) 0:01:51.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.031) 0:01:51.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.034) 0:01:51.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.033) 0:01:51.319 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.032) 0:01:51.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.036) 0:01:51.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.033) 0:01:51.422 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.034) 0:01:51.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.036) 0:01:51.493 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.125) 0:01:51.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.038) 0:01:51.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] 
******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.047) 0:01:51.705 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.042) 0:01:51.747 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.039) 0:01:51.786 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.038) 0:01:51.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.042) 0:01:51.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:24:38 +0000 (0:00:00.035) 0:01:51.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:24:38 +0000 
(0:00:00.102) 0:01:52.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.043) 0:01:52.049 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.050) 0:01:52.099 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.047) 0:01:52.147 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.040) 0:01:52.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.033) 0:01:52.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.040) 0:01:52.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.041) 0:01:52.302 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.049) 0:01:52.352 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104241.8721216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104241.8721216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104241.8721216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, 
"version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.411) 0:01:52.764 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.040) 0:01:52.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.038) 0:01:52.842 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.035) 0:01:52.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.036) 0:01:52.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.043) 0:01:52.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:24:39 +0000 (0:00:00.031) 0:01:52.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.031) 0:01:53.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.053 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.039) 0:01:53.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.034) 0:01:53.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.033) 0:01:53.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.030) 0:01:53.224 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.030) 0:01:53.255 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.040) 0:01:53.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.042) 0:01:53.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.033) 0:01:53.372 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.033) 0:01:53.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.438 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.031) 0:01:53.502 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.035) 0:01:53.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.033) 0:01:53.571 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.033) 0:01:53.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.031) 0:01:53.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:24:40 +0000 (0:00:00.032) 0:01:53.701 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.381) 0:01:54.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.037) 0:01:54.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.032) 0:01:54.153 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.033) 0:01:54.187 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": 
"/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.043) 0:01:54.230 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, 
"/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.043) 0:01:54.274 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.395) 0:01:54.669 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.040) 0:01:54.709 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2684354560.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 
17:24:41 +0000 (0:00:00.041) 0:01:54.751 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.038) 0:01:54.789 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.036) 0:01:54.825 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:24:41 +0000 (0:00:00.039) 0:01:54.864 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.030856", "end": "2022-06-01 13:24:41.592943", "rc": 0, "start": "2022-06-01 13:24:41.562087" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.420) 0:01:55.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.045) 
0:01:55.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.047) 0:01:55.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.038) 0:01:55.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.034) 0:01:55.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.036) 0:01:55.486 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.035) 0:01:55.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.035) 0:01:55.558 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": 
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.041) 0:01:55.599 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.080) 0:01:55.680 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Get the size of test2 volume again] ************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:138 Wednesday 01 June 2022 17:24:42 +0000 (0:00:00.038) 0:01:55.718 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.004114", "end": "2022-06-01 13:24:42.425749", "rc": 0, "start": "2022-06-01 13:24:42.421635" } STDOUT: 2.5G TASK [Verify that removing test1 didn't cause a change in test2 size] ********** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:143 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.399) 0:01:56.118 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Grow test2 using a percentage-based size spec] *************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:147 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.038) 0:01:56.156 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.089) 0:01:56.245 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.047) 0:01:56.293 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.550) 0:01:56.843 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.075) 0:01:56.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 
Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.032) 0:01:56.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:24:43 +0000 (0:00:00.032) 0:01:56.983 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:24:44 +0000 (0:00:00.078) 0:01:57.062 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:24:44 +0000 (0:00:00.029) 0:01:57.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:24:44 +0000 (0:00:00.923) 0:01:58.015 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test2", "name": "test2", "size": "50%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:24:45 +0000 (0:00:00.038) 0:01:58.054 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 
'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:24:45 +0000 (0:00:00.035) 0:01:58.089 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:24:46 +0000 (0:00:01.271) 0:01:59.360 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:24:46 +0000 (0:00:00.059) 0:01:59.420 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:24:46 +0000 (0:00:00.029) 0:01:59.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:24:46 +0000 (0:00:00.030) 0:01:59.480 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:24:46 +0000 (0:00:00.029) 0:01:59.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:24:47 +0000 (0:00:00.892) 0:02:00.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:24:49 +0000 (0:00:01.917) 0:02:02.319 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:24:49 +0000 (0:00:00.048) 0:02:02.368 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:24:49 +0000 (0:00:00.030) 0:02:02.398 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], 
"changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:24:51 +0000 (0:00:01.824) 
0:02:04.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.033) 0:02:04.257 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.031) 0:02:04.288 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.042) 0:02:04.330 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.040) 0:02:04.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.034) 0:02:04.405 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:24:51 +0000 (0:00:00.028) 0:02:04.434 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:24:52 +0000 (0:00:00.687) 0:02:05.122 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", 
"passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:24:52 +0000 (0:00:00.444) 0:02:05.566 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:24:53 +0000 (0:00:00.699) 0:02:06.266 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:24:53 +0000 (0:00:00.397) 0:02:06.663 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:24:53 +0000 (0:00:00.032) 
0:02:06.695 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:160 Wednesday 01 June 2022 17:24:54 +0000 (0:00:00.902) 0:02:07.598 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:24:54 +0000 (0:00:00.078) 0:02:07.677 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:24:54 +0000 (0:00:00.044) 0:02:07.722 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:24:54 +0000 (0:00:00.032) 0:02:07.754 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:24:55 +0000 (0:00:00.428) 0:02:08.182 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002982", "end": "2022-06-01 13:24:54.888651", "rc": 0, "start": "2022-06-01 13:24:54.885669" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:24:55 +0000 (0:00:00.404) 0:02:08.587 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003455", "end": "2022-06-01 13:24:55.296467", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:24:55.293012" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:24:55 +0000 (0:00:00.408) 0:02:08.996 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
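The loop-variable warning above can be addressed by renaming the inner loop variable with `loop_control`. A minimal sketch, assuming the outer loop already iterates with `storage_test_pool` (the task name and the replacement variable name are illustrative, not taken from the test suite):

```yaml
# Hypothetical fix for the 'storage_test_pool is already in use' warning:
# give the inner loop its own variable name via loop_control.loop_var,
# so it no longer shadows the outer loop's 'storage_test_pool'.
- name: Verify the volumes listed in storage_pools were correctly managed
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_inner  # anything except the colliding name
```

Inside `test-verify-pool.yml` the tasks would then reference `storage_test_pool_inner` instead of the default `item`, and the collision warning disappears.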
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.140) 0:02:09.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.034) 0:02:09.171 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.065) 0:02:09.237 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.041) 0:02:09.279 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.399) 0:02:09.678 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.043) 0:02:09.721 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.038) 0:02:09.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.038) 0:02:09.798 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.036) 0:02:09.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.032) 0:02:09.868 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.045) 0:02:09.913 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.062) 0:02:09.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:24:56 +0000 (0:00:00.033) 0:02:10.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.033) 0:02:10.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.032) 0:02:10.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.034) 0:02:10.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.032) 0:02:10.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:24:57 +0000 (0:00:00.033) 0:02:10.175 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.033) 0:02:10.208 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.035) 0:02:10.243 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.060) 0:02:10.303 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.062) 0:02:10.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.034) 0:02:10.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:24:57 +0000 (0:00:00.033) 0:02:10.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.032) 0:02:10.467 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.061) 0:02:10.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.038) 0:02:10.566 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.037) 0:02:10.603 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.057) 0:02:10.661 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.037) 0:02:10.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.037) 0:02:10.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.034) 0:02:10.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.030) 0:02:10.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.032) 0:02:10.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.036) 0:02:10.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.039) 0:02:10.910 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:24:57 +0000 (0:00:00.078) 0:02:10.988 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.063) 0:02:11.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.031) 0:02:11.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.033) 0:02:11.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.031) 0:02:11.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.247 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.080) 0:02:11.360 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.031) 0:02:11.392 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.059) 0:02:11.451 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.036) 0:02:11.487 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.125) 0:02:11.613 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.038) 0:02:11.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1200036, "block_size": 4096, "block_total": 1273760, "block_used": 73724, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 4915347456, "size_total": 5217320960, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1200036, "block_size": 4096, "block_total": 1273760, "block_used": 73724, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 4915347456, "size_total": 5217320960, "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.046) 0:02:11.698 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.039) 0:02:11.737 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.035) 0:02:11.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.038) 0:02:11.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.034) 0:02:11.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.032) 0:02:11.910 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.034) 0:02:11.945 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:24:58 +0000 (0:00:00.051) 0:02:11.997 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.039) 0:02:12.036 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.040) 0:02:12.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.031) 0:02:12.108 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.037) 0:02:12.146 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.039) 0:02:12.185 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.037) 0:02:12.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104290.4771216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104290.4771216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 18434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104290.4771216, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.400) 0:02:12.622 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.039) 0:02:12.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.038) 0:02:12.701 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.036) 0:02:12.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.034) 0:02:12.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.041) 0:02:12.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.032) 0:02:12.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.031) 0:02:12.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.033) 0:02:12.910 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.038) 0:02:12.948 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.030) 0:02:12.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:24:59 +0000 (0:00:00.033) 0:02:13.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.032) 0:02:13.045 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.033) 0:02:13.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.034) 
0:02:13.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.042) 0:02:13.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.038) 0:02:13.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.032) 0:02:13.226 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.032) 0:02:13.259 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.034) 0:02:13.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.035) 0:02:13.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.031) 0:02:13.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.033) 0:02:13.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.033) 0:02:13.427 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.032) 0:02:13.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.031) 0:02:13.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.034) 0:02:13.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.032) 0:02:13.559 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:25:00 +0000 (0:00:00.437) 0:02:13.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.034) 0:02:14.031 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.033) 0:02:14.064 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.036) 0:02:14.100 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.043) 0:02:14.144 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "5G", "type": "lvm", "uuid": "9532768d-cbff-4252-95dc-698dd62d6dcb" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "NLrJfq-Vt8m-ElDg-IZXo-v6vD-WZQk-mjt0t6" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", 
"label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.055) 0:02:14.199 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.409) 0:02:14.609 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:25:01 
+0000 (0:00:00.041) 0:02:14.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.042) 0:02:14.693 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.037) 0:02:14.731 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.035) 0:02:14.766 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:25:01 +0000 (0:00:00.043) 0:02:14.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.039585", "end": "2022-06-01 13:25:01.558621", "rc": 0, "start": "2022-06-01 13:25:01.519036" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.449) 
0:02:15.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.040) 0:02:15.300 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.040) 0:02:15.341 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.034) 0:02:15.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.037) 0:02:15.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.035) 0:02:15.449 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.037) 0:02:15.486 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.033) 0:02:15.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.033) 0:02:15.554 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.034) 0:02:15.588 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove both of the LVM logical volumes in 'foo' created above] *********** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:162 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.036) 0:02:15.625 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.098) 0:02:15.724 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:25:02 +0000 (0:00:00.047) 0:02:15.771 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 
Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.571) 0:02:16.342 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.073) 0:02:16.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.029) 0:02:16.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.029) 0:02:16.475 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.063) 0:02:16.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:25:03 +0000 (0:00:00.025) 0:02:16.564 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:25:04 +0000 (0:00:00.913) 0:02:17.478 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:25:04 +0000 (0:00:00.039) 0:02:17.517 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:25:04 +0000 (0:00:00.088) 0:02:17.606 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:25:05 +0000 (0:00:01.262) 0:02:18.869 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:25:05 +0000 (0:00:00.057) 0:02:18.926 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:25:05 +0000 (0:00:00.030) 0:02:18.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:25:05 +0000 (0:00:00.031) 0:02:18.988 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:25:06 +0000 (0:00:00.033) 0:02:19.021 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:25:06 +0000 (0:00:00.917) 0:02:19.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:25:08 +0000 (0:00:01.746) 0:02:21.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:25:08 +0000 (0:00:00.050) 0:02:21.736 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:25:08 +0000 (0:00:00.032) 0:02:21.769 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:25:10 +0000 (0:00:01.905) 0:02:23.675 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:25:10 +0000 (0:00:00.032) 0:02:23.708 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:25:10 +0000 (0:00:00.031) 0:02:23.739 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:25:10 +0000 (0:00:00.040) 0:02:23.780 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:25:10 +0000 (0:00:00.039) 0:02:23.819 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:25:10 +0000 (0:00:00.036) 0:02:23.856 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test2", 
"src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:25:11 +0000 (0:00:00.393) 0:02:24.249 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:25:11 +0000 (0:00:00.702) 0:02:24.952 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:25:11 +0000 (0:00:00.033) 0:02:24.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:25:12 +0000 (0:00:00.711) 0:02:25.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": 
false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:25:13 +0000 (0:00:00.418) 0:02:26.116 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:25:13 +0000 (0:00:00.032) 0:02:26.148 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:171 Wednesday 01 June 2022 17:25:14 +0000 (0:00:00.967) 0:02:27.115 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:25:14 +0000 (0:00:00.075) 0:02:27.191 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:25:14 +0000 (0:00:00.040) 0:02:27.231 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:25:14 +0000 (0:00:00.031) 0:02:27.262 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:25:14 +0000 (0:00:00.411) 0:02:27.673 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003444", 
"end": "2022-06-01 13:25:14.369630", "rc": 0, "start": "2022-06-01 13:25:14.366186" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.390) 0:02:28.064 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003117", "end": "2022-06-01 13:25:14.763962", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:25:14.760845" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.396) 0:02:28.460 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.058) 0:02:28.519 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.033) 0:02:28.553 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.064) 0:02:28.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.041) 0:02:28.659 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.029) 0:02:28.689 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.030) 0:02:28.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.037) 0:02:28.758 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.039) 0:02:28.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.039) 0:02:28.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.032) 0:02:28.869 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.030) 0:02:28.899 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.057) 0:02:28.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:25:15 +0000 (0:00:00.034) 
0:02:28.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.035) 0:02:29.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.032) 0:02:29.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.032) 0:02:29.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.031) 0:02:29.124 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.032) 0:02:29.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.033) 0:02:29.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.031) 0:02:29.220 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.060) 0:02:29.281 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.030) 0:02:29.312 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.122) 0:02:29.435 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.036) 0:02:29.472 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.030) 0:02:29.503 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.028) 
0:02:29.532 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.032) 0:02:29.564 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.065) 0:02:29.630 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.030) 0:02:29.660 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.032) 0:02:29.693 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.029) 0:02:29.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.037) 0:02:29.761 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.030)       0:02:29.791 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=771  changed=10   unreachable=0    failed=1    skipped=529  rescued=1    ignored=0

Wednesday 01 June 2022 17:25:16 +0000 (0:00:00.017)       0:02:29.809 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.75s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.75s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get required packages ---------------------- 1.40s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : get required packages ---------------------- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : get required packages ---------------------- 1.36s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
set up internal repositories -------------------------------------------- 1.31s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : get required packages ---------------------- 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : get required packages ---------------------- 1.26s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:25:17 +0000 (0:00:00.024)       0:00:00.024 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 17:25:18 +0000 (0:00:01.371)       0:00:01.395 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_percent_size_nvme_generated.yml ****************************
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 17:25:18 +0000 (0:00:00.026)       0:00:01.421 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:25:19 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022 17:25:21 +0000 (0:00:01.332)       0:00:01.355 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_lvm_percent_size_scsi_generated.yml ****************************
2 plays in /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_scsi_generated.yml:3
Wednesday 01 June 2022 17:25:21 +0000 (0:00:00.024)       0:00:01.379 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_scsi_generated.yml:7
Wednesday 01 June 2022 17:25:22 +0000 (0:00:01.129)       0:00:02.509 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:2
Wednesday 01 June 2022 17:25:22 +0000 (0:00:00.026)       0:00:02.536 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:17
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.850)       0:00:03.386 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.051)       0:00:03.438 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.182)       0:00:03.621 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.537)       0:00:04.159 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.079)       0:00:04.238 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.023)       0:00:04.262 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:25:23 +0000 (0:00:00.026)       0:00:04.289 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:25:24 +0000 (0:00:00.197)       0:00:04.486 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:25:24 +0000 (0:00:00.019)       0:00:04.506 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:25:25 +0000 (0:00:01.152)       0:00:05.658 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:25:25 +0000 (0:00:00.047)       0:00:05.706 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:25:25 +0000 (0:00:00.047)       0:00:05.753 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:25:26 +0000 (0:00:00.706)       0:00:06.459 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:25:26 +0000 (0:00:00.082)       0:00:06.542 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:25:26 +0000 (0:00:00.021)       0:00:06.564 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:25:26 +0000 (0:00:00.021)       0:00:06.588 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:25:26 +0000 (0:00:00.022)       0:00:06.610 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:25:27 +0000 (0:00:00.855)       0:00:07.465 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name":
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:25:29 +0000 (0:00:01.947) 0:00:09.413 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.042) 0:00:09.455 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.061) 0:00:09.517 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.538) 0:00:10.056 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.029) 0:00:10.085 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.026) 0:00:10.112 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.034) 0:00:10.147 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.035) 0:00:10.182 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.033) 0:00:10.216 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.030) 0:00:10.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:25:29 +0000 (0:00:00.030) 0:00:10.277 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:25:30 +0000 (0:00:00.028) 0:00:10.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:25:30 +0000 (0:00:00.029) 0:00:10.335 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:25:30 +0000 (0:00:00.494) 0:00:10.830 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:25:30 +0000 (0:00:00.032) 0:00:10.862 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:20
Wednesday 01 June 2022 17:25:31 +0000 (0:00:00.888) 0:00:11.751 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:25:31 +0000 (0:00:00.045) 0:00:11.797 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.537) 0:00:12.334 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.036) 0:00:12.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.034) 0:00:12.405 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Try to create LVM with an invalid size specification.]
*******************
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:27
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.034) 0:00:12.439 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.050) 0:00:12.490 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.045) 0:00:12.535 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.542) 0:00:13.078 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.085) 0:00:13.164 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.034) 0:00:13.198 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:25:32 +0000 (0:00:00.035) 0:00:13.234 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:25:33 +0000 (0:00:00.065) 0:00:13.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:25:33 +0000 (0:00:00.027) 0:00:13.326 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:25:33 +0000 (0:00:00.901) 0:00:14.227 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "2x%" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:25:33 +0000 (0:00:00.034) 0:00:14.266 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:25:34 +0000 (0:00:00.034) 0:00:14.300 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:25:35 +0000 (0:00:01.051) 0:00:15.351 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:25:35 +0000 (0:00:00.055) 0:00:15.407 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:25:35 +0000 (0:00:00.042) 0:00:15.449 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:25:35 +0000 (0:00:00.035) 0:00:15.484 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:25:35 +0000 (0:00:00.030) 0:00:15.514 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:25:36 +0000 (0:00:00.873) 0:00:16.388 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": 
"mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": 
"systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:25:37 +0000 (0:00:01.812) 0:00:18.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:25:37 +0000 (0:00:00.054) 0:00:18.255 ******** TASK 
[linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:25:37 +0000 (0:00:00.029) 0:00:18.285 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: invalid percentage '2x%' size specified for volume 'test1' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:25:39 +0000 (0:00:01.068) 0:00:19.354 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'2x%', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, 
u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid percentage '2x%' size specified for volume 'test1'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.043) 0:00:19.398 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:44 Wednesday 
01 June 2022 17:25:39 +0000 (0:00:00.028) 0:00:19.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check for the expected error message] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:50 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.036) 0:00:19.463 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create two LVM logical volumes under volume group 'foo' using percentage sizes] *** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:63 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.033) 0:00:19.497 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.054) 0:00:19.552 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.043) 0:00:19.596 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.532) 0:00:20.129 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.074) 0:00:20.203 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.031) 0:00:20.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:25:39 +0000 (0:00:00.033) 0:00:20.268 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:25:40 +0000 (0:00:00.066) 0:00:20.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:25:40 +0000 (0:00:00.068) 0:00:20.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:25:40 +0000 (0:00:00.854) 0:00:21.258 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "fs_type": "ext4", "mount_point": "/opt/test2", "name": "test2", "size": "40%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:25:40 +0000 (0:00:00.040) 0:00:21.298 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:25:41 +0000 (0:00:00.035) 0:00:21.334 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "e2fsprogs", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:25:42 +0000 (0:00:01.083) 0:00:22.417 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:25:42 +0000 (0:00:00.063) 0:00:22.480 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:25:42 +0000 (0:00:00.034) 0:00:22.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:25:42 +0000 (0:00:00.033) 0:00:22.548 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:25:42 +0000 (0:00:00.030) 0:00:22.579 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:25:43 +0000 (0:00:00.884) 0:00:23.464 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set 
storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:25:45 +0000 (0:00:01.983) 0:00:25.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:25:45 +0000 (0:00:00.050) 0:00:25.498 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:25:45 +0000 (0:00:00.030) 0:00:25.528 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:25:47 +0000 (0:00:02.065) 0:00:27.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.030) 0:00:27.623 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.029) 0:00:27.653 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.045) 0:00:27.698 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.042) 0:00:27.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.040) 0:00:27.780 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:25:47 +0000 (0:00:00.029) 0:00:27.810 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:25:48 +0000 (0:00:01.048) 0:00:28.858 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:25:49 +0000 (0:00:00.968) 0:00:29.826 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:25:50 +0000 (0:00:00.697) 0:00:30.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:25:50 +0000 (0:00:00.448) 0:00:30.972 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:25:50 +0000 (0:00:00.030) 0:00:31.003 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:79 Wednesday 01 June 2022 17:25:51 +0000 (0:00:00.884) 0:00:31.887 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:25:51 +0000 (0:00:00.058) 0:00:31.945 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:25:51 +0000 (0:00:00.047) 0:00:31.993 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:25:51 +0000 (0:00:00.034) 0:00:32.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", 
"name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:25:52 +0000 (0:00:00.509) 0:00:32.536 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003458", "end": "2022-06-01 13:25:52.066451", "rc": 0, "start": "2022-06-01 13:25:52.062993" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:25:52 +0000 (0:00:00.505) 0:00:33.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003312", "end": "2022-06-01 13:25:52.477045", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:25:52.473733" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.410) 0:00:33.451 ******** included: 
/tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.068) 0:00:33.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.033) 0:00:33.554 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.069) 0:00:33.623 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.042) 0:00:33.666 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.561) 0:00:34.228 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV 
count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:25:53 +0000 (0:00:00.046) 0:00:34.275 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.042) 0:00:34.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.039) 0:00:34.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.038) 0:00:34.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:34.429 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.046) 0:00:34.475 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.109) 0:00:34.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.034) 0:00:34.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:34.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:34.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.030) 0:00:34.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.031) 0:00:34.748 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:25:54 +0000 (0:00:00.035) 0:00:34.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.032) 0:00:34.816 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:34.849 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.063) 0:00:34.913 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.081) 0:00:34.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:35.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:35.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.031) 0:00:35.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.033) 0:00:35.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.031) 0:00:35.158 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.030) 0:00:35.189 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.063) 0:00:35.252 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:25:54 +0000 (0:00:00.040) 0:00:35.293 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.039) 0:00:35.332 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.065) 0:00:35.398 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.039) 0:00:35.437 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.043) 0:00:35.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.036) 0:00:35.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.032) 0:00:35.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.032) 0:00:35.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.031) 0:00:35.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:35.647 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.067) 0:00:35.715 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.090) 0:00:35.805 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:35.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.034) 0:00:35.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:35.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:35.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:35.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.035) 0:00:36.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.034) 0:00:36.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.034) 0:00:36.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.032) 0:00:36.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.034) 0:00:36.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.034) 0:00:36.181 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.035) 0:00:36.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.035) 0:00:36.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:25:55 +0000 (0:00:00.033) 0:00:36.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.034) 0:00:36.320 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.034) 0:00:36.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.034) 0:00:36.389 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.077) 0:00:36.466 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.040) 0:00:36.506 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.188) 0:00:36.694 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.041) 0:00:36.736 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.046) 0:00:36.782 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.042) 0:00:36.825 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.039) 0:00:36.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.041) 0:00:36.906 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap 
info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.043) 0:00:36.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.036) 0:00:36.987 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.033) 0:00:37.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.034) 0:00:37.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:25:56 +0000 
(0:00:00.051) 0:00:37.106 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.037) 0:00:37.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.039) 0:00:37.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.032) 0:00:37.216 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.034) 0:00:37.251 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:25:56 +0000 (0:00:00.039) 0:00:37.290 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.038) 0:00:37.329 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104346.5441215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104346.5441215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19105, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104346.5441215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.420) 0:00:37.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.039) 0:00:37.789 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.038) 0:00:37.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.036) 0:00:37.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.033) 0:00:37.898 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.038) 0:00:37.936 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.032) 0:00:37.969 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.032) 0:00:38.001 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.035) 0:00:38.037 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.045) 0:00:38.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.035) 0:00:38.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.034) 0:00:38.153 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.038) 0:00:38.191 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.036) 0:00:38.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:25:57 +0000 (0:00:00.039) 0:00:38.267 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.041) 0:00:38.308 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.037) 0:00:38.346 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.033) 0:00:38.379 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.032) 0:00:38.412 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.033) 0:00:38.446 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.036) 0:00:38.482 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.032) 0:00:38.514 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.030) 0:00:38.545 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.033) 0:00:38.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.032) 0:00:38.611 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.033) 0:00:38.644 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.035) 0:00:38.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.032) 0:00:38.712 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:25:58 +0000 (0:00:00.552) 0:00:39.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.034) 0:00:39.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.034) 0:00:39.335 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.087) 0:00:39.422 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.047) 0:00:39.469 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.046) 0:00:39.516 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.399) 0:00:39.916 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.039) 0:00:39.955 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.040) 0:00:39.995 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.039) 0:00:40.035 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.036) 0:00:40.071 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:25:59 +0000 (0:00:00.043) 0:00:40.115 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.046420", "end": "2022-06-01 13:25:59.586264", "rc": 0, "start": "2022-06-01 13:25:59.539844" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.448) 0:00:40.563 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.039) 0:00:40.602 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.040) 0:00:40.642 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.034) 0:00:40.677 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.037) 0:00:40.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.035) 0:00:40.749 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.033) 0:00:40.783 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.034) 0:00:40.817 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.038) 0:00:40.855 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.141) 0:00:40.997 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.040) 0:00:41.037 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.046) 0:00:41.084 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.045) 0:00:41.130 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.038) 0:00:41.169 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.040) 0:00:41.209 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.033) 0:00:41.242 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:26:00 +0000 (0:00:00.033) 0:00:41.275 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.031) 0:00:41.307 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.034) 0:00:41.341 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.049) 0:00:41.391 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.037) 0:00:41.428 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.040) 0:00:41.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.031) 0:00:41.500 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.034) 0:00:41.535 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.040) 0:00:41.576 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.046) 0:00:41.623 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104346.2871215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104346.2871215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19071, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104346.2871215, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.410) 0:00:42.033 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.040) 0:00:42.074 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.039) 0:00:42.114 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.037) 0:00:42.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.041) 0:00:42.193 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.051) 0:00:42.244 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:26:01 +0000 (0:00:00.047) 0:00:42.292 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.047) 0:00:42.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.087) 0:00:42.427 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.057) 0:00:42.484 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.050) 0:00:42.535 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.036) 0:00:42.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.032) 0:00:42.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.033) 0:00:42.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.035) 0:00:42.673 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.042) 0:00:42.716 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.039) 0:00:42.755 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.035) 0:00:42.791 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.035) 0:00:42.826 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.039) 0:00:42.865 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.040) 0:00:42.906 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.034) 0:00:42.940 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.032) 0:00:42.973 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.032) 0:00:43.005 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.033) 0:00:43.039 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.035) 0:00:43.074 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.036) 0:00:43.111 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:26:02 +0000 (0:00:00.032) 0:00:43.143 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.424) 0:00:43.568 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.036) 0:00:43.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.035) 0:00:43.640 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.040) 0:00:43.680 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device":
"/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.048) 0:00:43.728 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": 
{ "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.054) 0:00:43.783 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.396) 0:00:44.179 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.042) 0:00:44.222 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": 
"4294967296.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:26:03 +0000 (0:00:00.042) 0:00:44.265 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.040) 0:00:44.305 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.038) 0:00:44.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.046) 0:00:44.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.038822", "end": "2022-06-01 13:26:03.856456", "rc": 0, "start": "2022-06-01 13:26:03.817634" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.452) 0:00:44.843 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": 
false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.043) 0:00:44.886 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.046) 0:00:44.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.036) 0:00:44.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.036) 0:00:45.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.034) 0:00:45.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.036) 0:00:45.078 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] 
********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.034) 0:00:45.112 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.034) 0:00:45.147 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.031) 0:00:45.179 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:81 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.033) 0:00:45.213 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:26:04 +0000 (0:00:00.066) 0:00:45.279 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.048) 0:00:45.328 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.584) 0:00:45.912 ******** skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.187) 0:00:46.100 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.031) 0:00:46.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.032) 0:00:46.164 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.068) 0:00:46.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:26:05 +0000 (0:00:00.028) 0:00:46.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:26:06 +0000 (0:00:00.933) 0:00:47.194 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "mount_point": "/opt/test2", "name": "test2", "size": "40%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:26:06 +0000 (0:00:00.042) 0:00:47.237 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:26:06 +0000 (0:00:00.040) 0:00:47.278 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : 
enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:26:08 +0000 (0:00:01.387) 0:00:48.666 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:26:08 +0000 (0:00:00.061) 0:00:48.727 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:26:08 +0000 (0:00:00.031) 0:00:48.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:26:08 +0000 (0:00:00.035) 0:00:48.794 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:26:08 +0000 (0:00:00.032) 0:00:48.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:26:09 +0000 (0:00:00.905) 0:00:49.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": 
"static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": 
"enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:26:11 +0000 (0:00:01.779) 0:00:51.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:26:11 +0000 (0:00:00.050) 0:00:51.563 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:26:11 +0000 (0:00:00.032) 0:00:51.595 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:26:12 +0000 (0:00:01.590) 0:00:53.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:26:12 +0000 (0:00:00.036) 0:00:53.222 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:26:12 +0000 (0:00:00.033) 0:00:53.255 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], 
"volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:26:13 +0000 (0:00:00.051) 0:00:53.307 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:26:13 +0000 (0:00:00.045) 0:00:53.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:26:13 +0000 (0:00:00.042) 0:00:53.394 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:26:13 +0000 (0:00:00.032) 0:00:53.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:26:13 +0000 (0:00:00.703) 0:00:54.130 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => 
{ "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:26:14 +0000 (0:00:00.822) 0:00:54.952 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:26:15 +0000 (0:00:00.712) 0:00:55.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, 
"mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:26:15 +0000 (0:00:00.391) 0:00:56.056 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:26:15 +0000 (0:00:00.039) 0:00:56.095 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:95 Wednesday 01 June 2022 17:26:16 +0000 (0:00:00.935) 0:00:57.031 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:26:16 +0000 (0:00:00.058) 0:00:57.090 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": 
"/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:26:16 +0000 (0:00:00.048) 0:00:57.138 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info 
about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:26:16 +0000 (0:00:00.033) 0:00:57.172 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": 
"", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:26:17 +0000 (0:00:00.408) 0:00:57.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003657", "end": "2022-06-01 13:26:17.007884", "rc": 0, "start": "2022-06-01 13:26:17.004227" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:26:17 +0000 (0:00:00.407) 0:00:57.988 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003056", "end": "2022-06-01 13:26:17.435411", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:26:17.432355" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.426) 0:00:58.414 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
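For reference, the pool and volume parameters echoed in the role output above (pool `foo` on `sda`, LVM volumes `test1`/`test2` sized `60%`/`40%`, xfs and ext4, mounted at `/opt/test1` and `/opt/test2`) correspond to a `storage_pools` input along the following lines. This is a minimal sketch reconstructed from the logged output; the play scaffolding (`hosts`, role name spelling) is an assumption, not taken from this log.

```yaml
# Sketch of the storage_pools input implied by the output above
# (values copied from the logged pool/volume facts; scaffolding assumed).
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ["sda"]
            type: lvm
            state: present
            volumes:
              - name: test1          # -> /dev/mapper/foo-test1
                size: "60%"          # percentage of pool capacity
                fs_type: xfs
                mount_point: /opt/test1
              - name: test2          # -> /dev/mapper/foo-test2
                size: "40%"
                fs_type: ext4
                mount_point: /opt/test2
```

With a 10G `sda`, these percentages match the `6G`/`4G` device sizes reported in the "Collect info about the volumes" task and the two `/dev/mapper/foo-test*` entries appended to `/etc/fstab` above.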
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.072) 0:00:58.487 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.035) 0:00:58.523 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.067) 0:00:58.590 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.043) 0:00:58.633 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.416) 0:00:59.049 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.090) 0:00:59.140 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.042) 0:00:59.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.040) 0:00:59.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.040) 0:00:59.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:26:18 +0000 (0:00:00.033) 0:00:59.298 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.050) 0:00:59.348 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.064) 0:00:59.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.033) 0:00:59.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.034) 0:00:59.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.034) 0:00:59.515 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.033) 0:00:59.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.033) 0:00:59.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:26:19 +0000 (0:00:00.039) 0:00:59.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.033) 0:00:59.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.034) 0:00:59.690 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.063) 0:00:59.754 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.080) 0:00:59.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.045) 0:00:59.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.042) 0:00:59.922 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.034) 0:00:59.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.037) 0:00:59.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.035) 0:01:00.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.033) 0:01:00.063 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.072) 0:01:00.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK 
[Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.040) 0:01:00.177 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.038) 0:01:00.216 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:26:19 +0000 (0:00:00.064) 0:01:00.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.038) 0:01:00.318 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.039) 0:01:00.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.032) 0:01:00.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:00.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.033) 0:01:00.459 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.037) 0:01:00.496 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.035) 0:01:00.531 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.065) 0:01:00.597 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.091) 0:01:00.688 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:00.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:00.758 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.033) 0:01:00.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.049) 0:01:00.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.037) 0:01:00.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.035) 0:01:00.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.035) 0:01:00.950 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:00.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:01.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.037) 0:01:01.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.033) 0:01:01.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.035) 0:01:01.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.034) 0:01:01.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.035) 0:01:01.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:26:20 +0000 (0:00:00.033) 0:01:01.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.097) 0:01:01.327 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.034) 0:01:01.362 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.076) 0:01:01.439 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.039) 0:01:01.479 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.139) 0:01:01.618 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
17:26:21 +0000 (0:00:00.036) 0:01:01.655 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.041) 0:01:01.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.036) 0:01:01.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.042) 0:01:01.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.043) 0:01:01.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.031) 0:01:01.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.032) 0:01:01.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.035) 0:01:01.919 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.033) 0:01:01.952 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.052) 0:01:02.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.038) 0:01:02.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.037) 0:01:02.080 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.032) 0:01:02.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.033) 0:01:02.146 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.037) 0:01:02.184 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:26:21 +0000 (0:00:00.036) 0:01:02.221 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104346.5441215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104346.5441215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19105, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104346.5441215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.400) 0:01:02.621 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.040) 0:01:02.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.038) 0:01:02.700 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.036) 0:01:02.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.032) 0:01:02.770 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.038) 0:01:02.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.036) 0:01:02.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.040) 0:01:02.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.036) 0:01:02.921 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.043) 0:01:02.965 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.033) 0:01:02.998 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.031) 0:01:03.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.033) 0:01:03.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.035) 0:01:03.098 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.033) 0:01:03.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.043) 0:01:03.175 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.045) 0:01:03.220 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.033) 0:01:03.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:26:22 +0000 (0:00:00.033) 0:01:03.287 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.036) 0:01:03.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.035) 0:01:03.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.035) 0:01:03.395 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.035) 0:01:03.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.044) 0:01:03.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.034) 0:01:03.509 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.038) 0:01:03.548 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.033) 0:01:03.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.034) 0:01:03.616 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.408) 0:01:04.024 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.036) 0:01:04.061 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.033) 0:01:04.094 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.034) 0:01:04.129 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.049) 0:01:04.179 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:26:23 +0000 (0:00:00.048) 0:01:04.227 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.426) 0:01:04.653 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.041) 0:01:04.695 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.040) 0:01:04.736 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.039) 0:01:04.775 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.038) 0:01:04.814 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:26:24 +0000 (0:00:00.043) 0:01:04.857 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038454", "end": "2022-06-01 13:26:24.330033", "rc": 0, "start": "2022-06-01 13:26:24.291579" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.448) 0:01:05.305 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.040) 0:01:05.346 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.032) 0:01:05.387 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.032) 0:01:05.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.047) 0:01:05.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.037) 0:01:05.505 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.034) 0:01:05.539 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.035) 0:01:05.575 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.039) 0:01:05.614 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.131) 0:01:05.746 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.038) 0:01:05.785 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 3908145152, "size_total": 4139483136, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.048) 0:01:05.833 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.039) 0:01:05.872 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.036) 0:01:05.909 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.038) 0:01:05.947 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.032) 0:01:05.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.034) 0:01:06.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.032) 0:01:06.046 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.033) 0:01:06.079 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.050) 0:01:06.129 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.037) 0:01:06.167 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.052) 0:01:06.220 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:26:25 +0000 (0:00:00.050) 0:01:06.271 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.034) 0:01:06.305 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.041) 0:01:06.346 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.040) 0:01:06.386 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104372.1491215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104372.1491215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19071, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104372.1491215, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.402) 0:01:06.789 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.041) 0:01:06.830 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.039) 0:01:06.870 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.079) 0:01:06.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.033) 0:01:06.983 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.039) 0:01:07.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.033) 0:01:07.056 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.032) 0:01:07.089 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.032) 0:01:07.122 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.041) 0:01:07.163 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.033) 0:01:07.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.033) 0:01:07.231 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.033) 0:01:07.264 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:26:26 +0000 (0:00:00.030) 0:01:07.295 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.030) 0:01:07.326 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.044) 0:01:07.370 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.047) 0:01:07.418 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.035) 0:01:07.454 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.033) 0:01:07.487 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.033) 0:01:07.520 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.035) 0:01:07.556 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.037) 0:01:07.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.035) 0:01:07.628 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.034) 0:01:07.663 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.037) 0:01:07.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.034) 0:01:07.735 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.035) 0:01:07.770 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.037) 0:01:07.807 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.420) 0:01:08.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:26:27 +0000 (0:00:00.037) 0:01:08.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.035) 0:01:08.300 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.036) 0:01:08.337 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "40%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.046) 0:01:08.384 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.051) 0:01:08.435 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.414) 0:01:08.850 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.043) 0:01:08.894 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.048) 0:01:08.942 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.036) 0:01:08.978 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.036) 0:01:09.015 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:26:28 +0000 (0:00:00.043) 0:01:09.058 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.038857", "end": "2022-06-01 13:26:28.515092", "rc": 0, "start": "2022-06-01 13:26:28.476235" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.433) 0:01:09.491 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.045) 0:01:09.536 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.042) 0:01:09.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.035) 0:01:09.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.035) 0:01:09.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.036) 0:01:09.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 
17:26:29 +0000 (0:00:00.034) 0:01:09.721 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.039) 0:01:09.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.035) 0:01:09.796 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.033) 0:01:09.829 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Shrink test2 volume via percentage-based size spec] ********************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:97 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.035) 0:01:09.865 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.075) 0:01:09.940 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:26:29 +0000 (0:00:00.047) 0:01:09.988 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] 
**** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.542) 0:01:10.530 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.074) 0:01:10.605 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.032) 0:01:10.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 
Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.032) 0:01:10.670 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.064) 0:01:10.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:26:30 +0000 (0:00:00.029) 0:01:10.764 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:26:31 +0000 (0:00:00.929) 0:01:11.694 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:26:31 +0000 (0:00:00.044) 0:01:11.738 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:26:31 +0000 (0:00:00.044) 0:01:11.782 ******** ok: [/cache/rhel-x.qcow2] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:26:32 +0000 (0:00:01.443) 0:01:13.226 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:26:32 +0000 (0:00:00.064) 0:01:13.290 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:26:33 +0000 (0:00:00.031) 0:01:13.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:26:33 +0000 (0:00:00.032) 0:01:13.354 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:26:33 +0000 (0:00:00.030) 0:01:13.384 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:26:33 +0000 (0:00:00.893) 0:01:14.278 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": 
{ "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": 
{ "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:26:35 +0000 (0:00:01.863) 0:01:16.142 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:26:35 +0000 (0:00:00.049) 0:01:16.192 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:26:35 +0000 (0:00:00.032) 0:01:16.224 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": 
"defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:26:37 +0000 (0:00:02.020) 0:01:18.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:26:37 +0000 (0:00:00.034) 0:01:18.279 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.071) 0:01:18.351 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", 
"passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.045) 0:01:18.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.044) 0:01:18.440 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.036) 0:01:18.477 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.031) 0:01:18.509 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, 
"status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:26:38 +0000 (0:00:00.695) 0:01:19.204 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:26:39 +0000 (0:00:00.799) 0:01:20.003 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:26:40 +0000 (0:00:00.688) 0:01:20.692 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 
1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:26:40 +0000 (0:00:00.416) 0:01:21.109 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:26:40 +0000 (0:00:00.033) 0:01:21.142 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:112 Wednesday 01 June 2022 17:26:41 +0000 (0:00:00.927) 0:01:22.070 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:26:41 +0000 (0:00:00.063) 0:01:22.133 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:26:41 +0000 (0:00:00.047) 0:01:22.181 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:26:41 +0000 (0:00:00.036) 0:01:22.217 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": 
"partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:26:42 +0000 (0:00:00.406) 0:01:22.624 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003179", "end": "2022-06-01 13:26:42.061356", "rc": 0, "start": "2022-06-01 13:26:42.058177" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:26:42 +0000 (0:00:00.414) 0:01:23.039 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002953", "end": "2022-06-01 13:26:42.463595", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:26:42.460642" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.402) 0:01:23.441 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.071) 0:01:23.513 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.033) 0:01:23.546 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.064) 0:01:23.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.100) 0:01:23.711 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.412) 0:01:24.123 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.046) 0:01:24.170 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.042) 0:01:24.212 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.041) 0:01:24.254 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:26:43 +0000 (0:00:00.042) 0:01:24.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.036) 0:01:24.333 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.046) 
0:01:24.380 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.060) 0:01:24.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.034) 0:01:24.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.036) 0:01:24.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.034) 0:01:24.546 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.034) 0:01:24.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.034) 0:01:24.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.033) 0:01:24.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.035) 0:01:24.684 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.046) 0:01:24.731 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.065) 0:01:24.796 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.080) 0:01:24.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.033) 0:01:24.910 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.045) 0:01:24.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.035) 0:01:24.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.034) 0:01:25.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.035) 0:01:25.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.038) 0:01:25.099 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.068) 0:01:25.168 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.041) 0:01:25.210 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:26:44 +0000 (0:00:00.039) 0:01:25.249 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.076) 0:01:25.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.041) 0:01:25.367 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.047) 0:01:25.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.033) 
0:01:25.448 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.034) 0:01:25.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.036) 0:01:25.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.034) 0:01:25.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.037) 0:01:25.590 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.069) 0:01:25.659 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 
Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.091) 0:01:25.751 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.035) 0:01:25.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.035) 0:01:25.821 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.036) 0:01:25.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.035) 0:01:25.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.035) 0:01:25.929 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.033) 0:01:25.963 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.032) 0:01:25.995 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.033) 0:01:26.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.082) 0:01:26.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.038) 0:01:26.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.036) 0:01:26.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.034) 0:01:26.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.033) 0:01:26.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:26:45 +0000 (0:00:00.034) 0:01:26.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.041) 0:01:26.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.033) 0:01:26.364 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.035) 0:01:26.400 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.074) 0:01:26.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.055) 0:01:26.531 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.149) 0:01:26.680 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 
17:26:46 +0000 (0:00:00.042) 0:01:26.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1551090, "block_size": 4096, "block_total": 1570304, "block_used": 19214, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3145725, "inode_total": 3145728, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6353264640, "size_total": 6431965184, "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.047) 0:01:26.769 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.041) 0:01:26.810 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.040) 0:01:26.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.044) 0:01:26.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.036) 0:01:26.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.039) 0:01:26.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.034) 0:01:27.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.034) 0:01:27.041 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.054) 0:01:27.096 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.038) 0:01:27.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.040) 0:01:27.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.033) 0:01:27.208 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.033) 0:01:27.242 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:26:46 +0000 (0:00:00.040) 0:01:27.282 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.044) 0:01:27.326 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104346.5441215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104346.5441215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19105, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104346.5441215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.405) 0:01:27.731 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.041) 0:01:27.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.039) 0:01:27.812 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.039) 0:01:27.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.033) 0:01:27.885 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.039) 0:01:27.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.035) 0:01:27.960 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.031) 0:01:27.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.031) 0:01:28.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.040) 0:01:28.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.032) 0:01:28.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.032) 0:01:28.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.035) 0:01:28.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.037) 0:01:28.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.035) 
0:01:28.236 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:26:47 +0000 (0:00:00.040) 0:01:28.277 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.036) 0:01:28.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.030) 0:01:28.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.034) 0:01:28.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.030) 0:01:28.409 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.031) 0:01:28.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.032) 0:01:28.473 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.032) 0:01:28.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.030) 0:01:28.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.034) 0:01:28.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.081) 0:01:28.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.033) 0:01:28.685 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.032) 0:01:28.718 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.396) 0:01:29.114 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.034) 0:01:29.149 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.033) 0:01:29.183 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.037) 0:01:29.220 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": 
"present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:26:48 +0000 (0:00:00.055) 0:01:29.276 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, 
"/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.047) 0:01:29.323 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.393) 0:01:29.716 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.039) 0:01:29.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "6442450944.0" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.040) 0:01:29.796 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.037) 0:01:29.833 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.038) 0:01:29.871 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:26:49 +0000 (0:00:00.043) 0:01:29.915 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.031271", "end": "2022-06-01 13:26:49.371902", "rc": 0, "start": "2022-06-01 13:26:49.340631" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.436) 0:01:30.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.040) 0:01:30.393 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.040) 0:01:30.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.033) 0:01:30.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.035) 0:01:30.502 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.033) 0:01:30.536 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.033) 0:01:30.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.032) 0:01:30.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.036) 0:01:30.637 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.143) 0:01:30.781 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.046) 0:01:30.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": 
"a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.050) 0:01:30.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.042) 0:01:30.921 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.038) 0:01:30.960 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.042) 0:01:31.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.035) 0:01:31.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.036) 0:01:31.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.035) 0:01:31.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.037) 0:01:31.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.056) 0:01:31.203 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.037) 0:01:31.241 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:26:50 +0000 (0:00:00.039) 0:01:31.281 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.031) 0:01:31.312 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.033) 0:01:31.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.041) 0:01:31.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.041) 0:01:31.429 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104397.1631215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104397.1631215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19071, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": 
"0660", "mtime": 1654104397.1631215, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.428) 0:01:31.858 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.041) 0:01:31.899 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.041) 0:01:31.941 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.038) 0:01:31.979 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.033) 0:01:32.013 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 
Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.039) 0:01:32.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.032) 0:01:32.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.030) 0:01:32.116 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.032) 0:01:32.149 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.043) 0:01:32.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.033) 0:01:32.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.033) 
0:01:32.260 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:26:51 +0000 (0:00:00.033) 0:01:32.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.033) 0:01:32.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.032) 0:01:32.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.042) 0:01:32.403 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.037) 0:01:32.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 
Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.032) 0:01:32.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.031) 0:01:32.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.030) 0:01:32.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.031) 0:01:32.568 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.033) 0:01:32.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.032) 0:01:32.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.031) 0:01:32.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.031) 0:01:32.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.030) 0:01:32.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.033) 0:01:32.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.036) 0:01:32.797 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.432) 0:01:33.229 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 
Wednesday 01 June 2022 17:26:52 +0000 (0:00:00.035) 0:01:33.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.038) 0:01:33.303 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "6442450944.0" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.038) 0:01:33.342 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 
"60%", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.048) 0:01:33.390 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "dadf5e57-476e-4f97-9a0d-dd2027e64e5c" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.055) 0:01:33.446 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.395) 0:01:33.842 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.040) 0:01:33.883 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2684354560.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.038) 0:01:33.921 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.033) 0:01:33.955 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.032) 0:01:33.988 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:26:53 +0000 (0:00:00.047) 0:01:34.035 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.039229", "end": "2022-06-01 13:26:53.490001", "rc": 0, "start": "2022-06-01 13:26:53.450772" }
STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path:
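[Editor's note] Because the lvs call above passes --noheadings --nameprefixes --unquoted, its STDOUT is a single line of whitespace-separated KEY=VALUE fields, which the following set_fact reduces to the segment type. A minimal parsing sketch (the function name is mine; it assumes values contain no spaces, which holds for the fields queried here):

```python
def parse_lvs_nameprefixes(line: str) -> dict:
    """Parse one line of `lvs --noheadings --nameprefixes --unquoted`
    output (KEY=VALUE pairs separated by whitespace) into a dict."""
    # split("=", 1) keeps empty values like LVM2_CACHE_TOTAL_BLOCKS=
    pairs = (field.split("=", 1) for field in line.split())
    return {key: value for key, value in pairs}

stdout = ("LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- "
          "LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear")
info = parse_lvs_nameprefixes(stdout)
print(info["LVM2_SEGTYPE"])  # linear
```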
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.444) 0:01:34.480 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.052) 0:01:34.532 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.055) 0:01:34.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.038) 0:01:34.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.034) 0:01:34.660 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.035) 0:01:34.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 
17:26:54 +0000 (0:00:00.039) 0:01:34.736 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.034) 0:01:34.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.034) 0:01:34.805 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.031) 0:01:34.836 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Get the size of test2 volume] ******************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:114 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.034) 0:01:34.870 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.004214", "end": "2022-06-01 13:26:54.281861", "rc": 0, "start": "2022-06-01 13:26:54.277647" } STDOUT: 2.5G TASK [Remove the test1 volume without changing its size] *********************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:119 Wednesday 01 June 2022 17:26:54 +0000 (0:00:00.388) 0:01:35.259 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.141) 
0:01:35.400 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.048) 0:01:35.449 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.570) 0:01:36.020 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.078) 0:01:36.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an 
empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.033) 0:01:36.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.034) 0:01:36.166 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.067) 0:01:36.234 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:26:55 +0000 (0:00:00.028) 0:01:36.262 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:26:56 +0000 (0:00:00.912) 0:01:37.174 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%", "state": "absent" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path:
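[Editor's note] The storage_pools shown above size each LV as a percentage of the pool, and the earlier "Calculate the expected size" task derived the expected byte count the same way: pool size times the percentage. A sketch of that arithmetic (the function name is mine, not part of the role):

```python
def expected_bytes(pool_bytes: int, size_spec: str) -> float:
    """Expected volume size in bytes for a percentage size spec like "25%"."""
    return pool_bytes * float(size_spec.rstrip("%")) / 100

pool = 10737418240  # the 10 GiB /dev/sda pool from the log
print(expected_bytes(pool, "25%"))  # 2684354560.0, test2's asserted size
print(expected_bytes(pool, "60%"))  # 6442450944.0, i.e. test1's 6G
```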
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:26:56 +0000 (0:00:00.041) 0:01:37.216 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:26:56 +0000 (0:00:00.038) 0:01:37.255 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:26:58 +0000 (0:00:01.385) 0:01:38.641 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:26:58 +0000 (0:00:00.058) 0:01:38.699 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:26:58 +0000 (0:00:00.028) 0:01:38.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:26:58 +0000 (0:00:00.032) 0:01:38.760 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:26:58 +0000 (0:00:00.031) 0:01:38.791 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:26:59 +0000 (0:00:00.882) 0:01:39.673 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:27:01 +0000 (0:00:01.757) 0:01:41.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:27:01 +0000 (0:00:00.047) 0:01:41.478 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:27:01 +0000 (0:00:00.030) 0:01:41.509 ******** changed: [/cache/rhel-x.qcow2] => { 
"actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:27:03 +0000 (0:00:01.956) 0:01:43.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.032) 0:01:43.497 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.031) 0:01:43.528 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": 
null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.049) 0:01:43.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.044) 0:01:43.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.037) 0:01:43.660 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', 
u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:27:03 +0000 (0:00:00.410) 0:01:44.071 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:27:04 +0000 (0:00:00.681) 0:01:44.753 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:27:04 +0000 (0:00:00.411) 0:01:45.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 
Wednesday 01 June 2022 17:27:05 +0000 (0:00:00.691) 0:01:45.856 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:27:05 +0000 (0:00:00.406) 0:01:46.263 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:27:05 +0000 (0:00:00.033) 0:01:46.296 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:136 Wednesday 01 June 2022 17:27:08 +0000 (0:00:02.596) 0:01:48.892 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:27:08 +0000 (0:00:00.068) 0:01:48.961 ******** 
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:27:08 +0000 (0:00:00.049) 0:01:49.010 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:27:08 +0000 (0:00:00.033) 0:01:49.044 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", 
"label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:27:09 +0000 (0:00:00.428) 0:01:49.473 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003132", "end": "2022-06-01 13:27:08.905024", "rc": 0, "start": "2022-06-01 13:27:08.901892" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:27:09 +0000 (0:00:00.408) 0:01:49.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003573", "end": "2022-06-01 13:27:09.305158", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:27:09.301585" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:27:09 +0000 (0:00:00.401) 0:01:50.283 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.073) 0:01:50.357 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.037) 0:01:50.394 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.112) 0:01:50.506 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.041) 0:01:50.548 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.404) 0:01:50.953 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.054) 0:01:51.007 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.040) 0:01:51.048 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.039) 0:01:51.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.046) 0:01:51.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.032) 0:01:51.166 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.046) 0:01:51.212 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:27:10 +0000 (0:00:00.061) 0:01:51.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.033) 0:01:51.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.032) 0:01:51.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.032) 0:01:51.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.031) 0:01:51.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.031) 0:01:51.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:27:11 +0000 (0:00:00.037)       0:01:51.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.034)       0:01:51.507 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.033)       0:01:51.541 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.076)       0:01:51.617 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.080)       0:01:51.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.035)       0:01:51.733 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.034)       0:01:51.767 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.034)       0:01:51.802 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.033)       0:01:51.836 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.036)       0:01:51.872 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.035)       0:01:51.907 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.077)       0:01:51.985 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.039)       0:01:52.025 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.041)       0:01:52.066 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.067)       0:01:52.134 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.038)       0:01:52.173 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.040)       0:01:52.213 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.033)       0:01:52.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:27:11 +0000 (0:00:00.036)       0:01:52.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.033)       0:01:52.317 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.034)       0:01:52.352 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.035)       0:01:52.388 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.069)       0:01:52.457 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.087)       0:01:52.545 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:52.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:52.611 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:52.643 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.034)       0:01:52.678 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.033)       0:01:52.711 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.033)       0:01:52.744 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:52.777 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.099)       0:01:52.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.037)       0:01:52.914 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.034)       0:01:52.949 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.035)       0:01:52.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:53.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.032)       0:01:53.049 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.035)       0:01:53.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.034)       0:01:53.118 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.034)       0:01:53.153 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.036)       0:01:53.190 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:27:12 +0000 (0:00:00.083)       0:01:53.273 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.038)       0:01:53.311 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.135)       0:01:53.447 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.038)       0:01:53.486 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.043)       0:01:53.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.031)       0:01:53.561 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.042)       0:01:53.603 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.033)       0:01:53.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.030)       0:01:53.667 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.032)       0:01:53.699 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.031)       0:01:53.731 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.032)       0:01:53.764 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.049)       0:01:53.814 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.034)       0:01:53.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.043)       0:01:53.893 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.031)       0:01:53.924 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.035)       0:01:53.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.033)       0:01:53.993 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:27:13 +0000 (0:00:00.033)       0:01:54.026 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.394)       0:01:54.420 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.040)       0:01:54.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.030)       0:01:54.492 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.037)       0:01:54.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.035)       0:01:54.564 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.028)       0:01:54.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.035)       0:01:54.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.032)       0:01:54.661 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.032)       0:01:54.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.027)       0:01:54.721 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.031)       0:01:54.753 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:54.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.036)       0:01:54.822 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:54.856 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:54.889 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.040)       0:01:54.930 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.039)       0:01:54.969 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.032)       0:01:55.002 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:55.035 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.037)       0:01:55.073 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.035)       0:01:55.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:55.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.034)       0:01:55.177 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.033)       0:01:55.211 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:27:14 +0000 (0:00:00.036)       0:01:55.248 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.084)       0:01:55.333 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.034)       0:01:55.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.032)       0:01:55.433 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.032)       0:01:55.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.034)       0:01:55.500 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.038)       0:01:55.538 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.032)       0:01:55.571 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.032)       0:01:55.603 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.034)       0:01:55.638 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.035)       0:01:55.674 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.708 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.042)       0:01:55.750 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.038)       0:01:55.789 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.822 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.856 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.034)       0:01:55.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.035)       0:01:55.961 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:55.994 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.032)       0:01:56.027 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:56.061 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.033)       0:01:56.094 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.039)       0:01:56.134 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:27:15 +0000 (0:00:00.135)       0:01:56.269 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.038)       0:01:56.308 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 628684, "block_used": 39935, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 2411515904, "size_total": 2575089664, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.046)       0:01:56.354 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.037)       0:01:56.391 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.035)       0:01:56.427 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.038)       0:01:56.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.034)       0:01:56.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.032)       0:01:56.534 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:27:16 +0000
(0:00:00.033) 0:01:56.567 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.033) 0:01:56.601 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.049) 0:01:56.650 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.037) 0:01:56.688 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.042) 0:01:56.730 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.035) 0:01:56.766 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.035) 0:01:56.801 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.041) 0:01:56.843 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.039) 0:01:56.882 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104397.1631215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104397.1631215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19071, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104397.1631215, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:27:16 +0000 (0:00:00.411) 0:01:57.294 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.040) 0:01:57.334 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.040) 0:01:57.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.034) 0:01:57.409 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.030) 0:01:57.439 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.034) 0:01:57.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.031) 0:01:57.505 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.030) 0:01:57.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.033) 0:01:57.569 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.039) 0:01:57.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.030) 0:01:57.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.030) 0:01:57.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size]
*****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.030) 0:01:57.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.032) 0:01:57.732 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.036) 0:01:57.769 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.098) 0:01:57.867 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.037) 0:01:57.905 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.033) 0:01:57.938 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.036) 0:01:57.975 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.032) 0:01:58.008 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.033) 0:01:58.041 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.034) 0:01:58.076 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.036) 0:01:58.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.039) 0:01:58.152 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.039) 0:01:58.192 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.043) 0:01:58.236 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:27:17 +0000 (0:00:00.035) 0:01:58.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.037) 0:01:58.309 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.419) 0:01:58.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.033) 0:01:58.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.034) 0:01:58.797 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.038) 0:01:58.835 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [
{ "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "60%", "state": "absent", "type": "lvm", "vdo_pool_size": null },
{ "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device":
"/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "25%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.048) 0:01:58.884 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": {
"/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" },
"/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" },
"/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
"/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
"/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
"/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
"/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
"/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
"/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:27:18 +0000 (0:00:00.050) 0:01:58.935 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.404) 0:01:59.339 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_pool_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.042) 0:01:59.382 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2684354560.0" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.040) 0:01:59.422 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.038) 0:01:59.460 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2684354560.0" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.038) 0:01:59.499 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.041) 0:01:59.540 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.046385", "end": "2022-06-01 13:27:19.024081", "rc": 0, "start": "2022-06-01 13:27:18.977696" }
STDOUT:
LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.460) 0:02:00.000 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.039) 0:02:00.040 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.043) 0:02:00.083 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.034) 0:02:00.118 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.034) 0:02:00.152 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.036) 0:02:00.189 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.035) 0:02:00.225 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:27:19 +0000 (0:00:00.040) 0:02:00.265 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool":
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.033) 0:02:00.298 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.032) 0:02:00.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Get the size of test2 volume again] ************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:138 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.033) 0:02:00.365 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.004683", "end": "2022-06-01 13:27:19.797351", "rc": 0, "start": "2022-06-01 13:27:19.792668" } STDOUT: 2.5G TASK [Verify that removing test1 didn't cause a change in test2 size] ********** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:143 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.408) 0:02:00.774 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Grow test2 using a percentage-based size spec] *************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:147 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.037) 0:02:00.812 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.093) 0:02:00.905 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK 
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:27:20 +0000 (0:00:00.047) 0:02:00.953 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.550) 0:02:01.503 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.075) 0:02:01.578 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.033) 0:02:01.612 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.032) 0:02:01.644 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.068) 0:02:01.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:27:21 +0000 (0:00:00.026) 0:02:01.739 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:27:22 +0000 (0:00:00.947) 0:02:02.687 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test2", "name": "test2", "size": "50%" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:27:22 +0000 (0:00:00.040) 0:02:02.727 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:27:22 +0000 (0:00:00.035) 0:02:02.763 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:27:23 +0000 (0:00:01.305) 0:02:04.069 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:27:23 +0000 (0:00:00.058) 0:02:04.127 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:27:23 +0000 (0:00:00.031) 0:02:04.159 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:27:23 +0000 (0:00:00.032) 0:02:04.192 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:27:23 +0000 (0:00:00.031) 0:02:04.223 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
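The storage_pools value echoed by the "show storage_pools" task above corresponds to role input along these lines. This is a sketch reconstructed from the logged values only: the pool/volume fields come from this log, while the play wrapper and `hosts` target are assumptions.

```yaml
# Sketch only: pool/volume values are taken from the log above;
# the play structure and the `hosts: all` target are assumptions.
- hosts: all
  roles:
    - linux-system-roles.storage
  vars:
    storage_pools:
      - name: foo
        disks:
          - sda
        state: present
        volumes:
          - name: test2
            size: "50%"              # relative size, expressed against the pool
            mount_point: /opt/test2
```

Since this test defines only `storage_pools` and never sets `storage_volumes`, the "show storage_volumes" task reports the variable as undefined, which is expected for a pools-only test.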
TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:27:24 +0000 (0:00:00.863) 0:02:05.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:27:26 +0000 (0:00:01.741) 0:02:06.828 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:27:26 +0000 (0:00:00.049) 0:02:06.877 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:27:26 +0000 (0:00:00.030) 0:02:06.907 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:27:28 +0000 (0:00:01.823) 0:02:08.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.032) 0:02:08.764 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.030) 0:02:08.794 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/mapper/foo-test2", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.042) 0:02:08.837 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.085) 0:02:08.923 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.037) 0:02:08.960 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:27:28 +0000 (0:00:00.031) 0:02:08.991 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:27:29 +0000 (0:00:00.697) 0:02:09.689 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults",
"passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:27:29 +0000 (0:00:00.435) 0:02:10.125 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:27:30 +0000 (0:00:00.687) 0:02:10.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:27:30 +0000 (0:00:00.409) 0:02:11.222 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:27:30 +0000 (0:00:00.034) 
0:02:11.256 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:160 Wednesday 01 June 2022 17:27:31 +0000 (0:00:00.884) 0:02:12.140 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:27:31 +0000 (0:00:00.072) 0:02:12.213 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "50%", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:27:31 +0000 (0:00:00.049) 0:02:12.262 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:27:31 +0000 (0:00:00.033) 0:02:12.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "5G", "type": "lvm", "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:27:32 +0000 (0:00:00.420) 0:02:12.716 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003220", "end": "2022-06-01 13:27:32.154685", "rc": 0, "start": "2022-06-01 13:27:32.151465" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:27:32 +0000 (0:00:00.416) 0:02:13.133 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003125", "end": "2022-06-01 13:27:32.569732", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:27:32.566607" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.412) 0:02:13.546 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.069) 0:02:13.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.036) 0:02:13.652 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.067) 0:02:13.719 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.045) 0:02:13.765 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.411) 0:02:14.177 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK 
[Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.044) 0:02:14.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:27:33 +0000 (0:00:00.045) 0:02:14.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.089) 0:02:14.358 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.040) 0:02:14.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.032) 0:02:14.431 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.047) 0:02:14.478 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.062) 0:02:14.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.036) 0:02:14.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.034) 0:02:14.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.032) 0:02:14.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.033) 0:02:14.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.034) 0:02:14.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 
2022 17:27:34 +0000 (0:00:00.049) 0:02:14.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.039) 0:02:14.801 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.036) 0:02:14.838 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.064) 0:02:14.902 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.068) 0:02:14.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.033) 0:02:15.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 
June 2022 17:27:34 +0000 (0:00:00.035) 0:02:15.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.033) 0:02:15.073 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.071) 0:02:15.145 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.040) 0:02:15.185 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.039) 0:02:15.225 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:27:34 +0000 (0:00:00.062) 0:02:15.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.041) 0:02:15.329 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.040) 0:02:15.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.402 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.031) 0:02:15.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.031) 0:02:15.465 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.037) 0:02:15.536 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.074) 0:02:15.611 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.070) 0:02:15.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.034) 0:02:15.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.035) 0:02:15.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.920 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.033) 0:02:15.953 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.034) 0:02:15.988 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.066) 0:02:16.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.040) 0:02:16.095 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.132) 0:02:16.228 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:27:35 +0000 (0:00:00.049) 0:02:16.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1200036, "block_size": 4096, "block_total": 1273760, "block_used": 73724, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 4915347456, "size_total": 5217320960, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1200036, "block_size": 4096, "block_total": 1273760, "block_used": 73724, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime", "size_available": 4915347456, "size_total": 5217320960, "uuid": "a2df7b0f-c770-4314-b068-d506094efc14" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.049) 0:02:16.327 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.039) 0:02:16.367 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.111) 0:02:16.478 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.045) 0:02:16.523 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.032) 0:02:16.556 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.034) 0:02:16.591 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.032) 0:02:16.623 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.032) 0:02:16.655 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test2 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 ext4 defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.051) 0:02:16.707 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.038) 0:02:16.745 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.040) 0:02:16.785 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.034) 0:02:16.820 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.034) 0:02:16.855 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.044) 0:02:16.899 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:27:36 +0000 (0:00:00.042) 0:02:16.941 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654104447.7021215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654104447.7021215,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 19071,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654104447.7021215,
        "nlink": 1,
        "path": "/dev/mapper/foo-test2",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.412) 0:02:17.353 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.039) 0:02:17.393 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.038) 0:02:17.431 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.035) 0:02:17.466 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.036) 0:02:17.503 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.047) 0:02:17.551 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.034) 0:02:17.585 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.035) 0:02:17.620 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.033) 0:02:17.653 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.044) 0:02:17.697 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.033) 0:02:17.731 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.037) 0:02:17.769 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.036) 0:02:17.805 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.036) 0:02:17.842 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.035) 0:02:17.877 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.047) 0:02:17.925 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.039) 0:02:17.964 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.037) 0:02:18.002 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.037) 0:02:18.040 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.039) 0:02:18.080 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.036) 0:02:18.116 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.035) 0:02:18.151 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.037) 0:02:18.189 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.041) 0:02:18.230 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.034) 0:02:18.265 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:27:37 +0000 (0:00:00.033) 0:02:18.298 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.035) 0:02:18.334 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.032) 0:02:18.367 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.419) 0:02:18.786 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.032) 0:02:18.819 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.036) 0:02:18.855 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "2684354560.0"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.042) 0:02:18.897 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_pool": {
        "disks": [
            "sda"
        ],
        "encryption": false,
        "encryption_cipher": null,
        "encryption_key": null,
        "encryption_key_size": null,
        "encryption_luks_version": null,
        "encryption_password": null,
        "name": "foo",
        "raid_chunk_size": null,
        "raid_device_count": null,
        "raid_level": null,
        "raid_metadata_version": null,
        "raid_spare_count": null,
        "state": "present",
        "type": "lvm",
        "volumes": [
            {
                "_device": "/dev/mapper/foo-test2",
                "_kernel_device": "/dev/dm-0",
                "_mount_id": "/dev/mapper/foo-test2",
                "_raw_device": "/dev/mapper/foo-test2",
                "_raw_kernel_device": "/dev/dm-0",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "ext4",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test2",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_disks": [],
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": "50%",
                "state": "present",
                "type": "lvm",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.047) 0:02:18.945 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_blkinfo": {
        "changed": false,
        "failed": false,
        "info": {
            "/dev/mapper/foo-test2": {
                "fstype": "ext4",
                "label": "",
                "name": "/dev/mapper/foo-test2",
                "size": "5G",
                "type": "lvm",
                "uuid": "a2df7b0f-c770-4314-b068-d506094efc14"
            },
            "/dev/sda": {
                "fstype": "LVM2_member",
                "label": "",
                "name": "/dev/sda",
                "size": "10G",
                "type": "disk",
                "uuid": "f2IJ8b-Qyp9-63Cd-7XV5-xc7R-O0Mh-TbWXN0"
            },
            "/dev/sdb": {
                "fstype": "",
                "label": "",
                "name": "/dev/sdb",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            },
            "/dev/sdc": {
                "fstype": "",
                "label": "",
                "name": "/dev/sdc",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            },
            "/dev/sr0": {
                "fstype": "iso9660",
                "label": "cidata",
                "name": "/dev/sr0",
                "size": "368K",
                "type": "rom",
                "uuid": "2022-06-01-16-20-09-00"
            },
            "/dev/vda": {
                "fstype": "",
                "label": "",
                "name": "/dev/vda",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            },
            "/dev/vda1": {
                "fstype": "",
                "label": "",
                "name": "/dev/vda1",
                "size": "1M",
                "type": "partition",
                "uuid": ""
            },
            "/dev/vda2": {
                "fstype": "vfat",
                "label": "",
                "name": "/dev/vda2",
                "size": "200M",
                "type": "partition",
                "uuid": "7B77-95E7"
            },
            "/dev/vda3": {
                "fstype": "xfs",
                "label": "boot",
                "name": "/dev/vda3",
                "size": "500M",
                "type": "partition",
                "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7"
            },
            "/dev/vda4": {
                "fstype": "xfs",
                "label": "root",
                "name": "/dev/vda4",
                "size": "9.3G",
                "type": "partition",
                "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345"
            },
            "/dev/vdb": {
                "fstype": "",
                "label": "",
                "name": "/dev/vdb",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            },
            "/dev/vdc": {
                "fstype": "",
                "label": "",
                "name": "/dev/vdc",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            },
            "/dev/vdd": {
                "fstype": "",
                "label": "",
                "name": "/dev/vdd",
                "size": "10G",
                "type": "disk",
                "uuid": ""
            }
        }
    }
}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:27:38 +0000 (0:00:00.048) 0:02:18.993 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 10737418240,
    "changed": false,
    "lvm": "10g",
    "parted": "10GiB",
    "size": "10 GiB"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.407) 0:02:19.400 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_pool_size": {
        "bytes": 10737418240,
        "changed": false,
        "failed": false,
        "lvm": "10g",
        "parted": "10GiB",
        "size": "10 GiB"
    }
}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.041) 0:02:19.442 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "5368709120.0"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.040) 0:02:19.482 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 5368709120,
        "changed": false,
        "failed": false,
        "lvm": "5g",
        "parted": "5GiB",
        "size": "5 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.037) 0:02:19.520 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120.0"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.035) 0:02:19.555 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.043) 0:02:19.599 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test2"
    ],
    "delta": "0:00:00.040340",
    "end": "2022-06-01 13:27:39.061042",
    "rc": 0,
    "start": "2022-06-01 13:27:39.020702"
}

STDOUT:

LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.443) 0:02:20.043 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.041) 0:02:20.084 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.041) 0:02:20.125 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.035) 0:02:20.161 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.092) 0:02:20.253 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:27:39 +0000 (0:00:00.037) 0:02:20.291 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.036) 0:02:20.328 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.034) 0:02:20.362 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.031) 0:02:20.394 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.027) 0:02:20.421 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Remove both of the LVM logical volumes in 'foo' created above] ***********
task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:162
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.035) 0:02:20.457 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.094) 0:02:20.551 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.050) 0:02:20.602 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.568) 0:02:21.171 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.075) 0:02:21.246 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:27:40 +0000 (0:00:00.035) 0:02:21.282 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:27:41 +0000 (0:00:00.034) 0:02:21.316 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:27:41 +0000 (0:00:00.066) 0:02:21.382 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:27:41 +0000 (0:00:00.030) 0:02:21.413 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:27:42 +0000 (0:00:00.944) 0:02:22.358 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "state": "absent"
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:27:42 +0000 (0:00:00.038) 0:02:22.397 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:27:42 +0000 (0:00:00.035) 0:02:22.432 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:27:43 +0000 (0:00:01.316) 0:02:23.749 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:27:43 +0000 (0:00:00.067) 0:02:23.816 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:27:43 +0000 (0:00:00.030) 0:02:23.846 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:27:43 +0000 (0:00:00.034) 0:02:23.881 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:27:43 +0000 (0:00:00.031) 0:02:23.912 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:27:44 +0000 (0:00:00.852) 0:02:24.765 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:27:46 +0000 (0:00:02.044) 0:02:26.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:27:46 +0000 (0:00:00.050) 0:02:26.861 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:27:46 +0000 (0:00:00.032) 0:02:26.893 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:27:48 +0000 (0:00:01.916) 0:02:28.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:27:48 +0000 (0:00:00.033) 0:02:28.843 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:27:48 +0000 (0:00:00.029) 0:02:28.872 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:27:48 +0000 (0:00:00.049) 0:02:28.922 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:27:48 +0000 (0:00:00.040) 0:02:28.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:27:48 +0000 (0:00:00.037) 0:02:29.001 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test2', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test2", 
"src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:27:49 +0000 (0:00:00.425) 0:02:29.426 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:27:49 +0000 (0:00:00.663) 0:02:30.090 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:27:49 +0000 (0:00:00.033) 0:02:30.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:27:50 +0000 (0:00:00.681) 0:02:30.805 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": 
false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:27:50 +0000 (0:00:00.403) 0:02:31.209 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:27:50 +0000 (0:00:00.032) 0:02:31.242 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml:171 Wednesday 01 June 2022 17:27:51 +0000 (0:00:00.880) 0:02:32.122 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:27:51 +0000 (0:00:00.073) 0:02:32.196 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:27:51 +0000 (0:00:00.044) 0:02:32.241 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:27:51 +0000 (0:00:00.035) 0:02:32.276 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:27:52 +0000 (0:00:00.401) 0:02:32.678 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002813", 
"end": "2022-06-01 13:27:52.094202", "rc": 0, "start": "2022-06-01 13:27:52.091389" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:27:52 +0000 (0:00:00.396) 0:02:33.074 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002804", "end": "2022-06-01 13:27:52.498304", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:27:52.495500" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.403) 0:02:33.478 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.061) 0:02:33.539 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.035) 0:02:33.574 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.119) 0:02:33.694 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.041) 0:02:33.736 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.030) 0:02:33.767 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.030) 0:02:33.797 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.043) 0:02:33.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.038) 0:02:33.879 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.038) 0:02:33.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.031) 0:02:33.950 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.029) 0:02:33.980 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.061) 0:02:34.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.034) 
0:02:34.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.035) 0:02:34.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.030) 0:02:34.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.034) 0:02:34.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.036) 0:02:34.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.032) 0:02:34.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:27:53 +0000 (0:00:00.031) 0:02:34.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.035) 0:02:34.313 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.059) 0:02:34.372 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.033) 0:02:34.406 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.078) 0:02:34.484 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.039) 0:02:34.524 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.032) 0:02:34.556 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.037) 
0:02:34.594 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.034) 0:02:34.628 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.067) 0:02:34.696 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.031) 0:02:34.728 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.033) 0:02:34.762 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.032) 0:02:34.795 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.032) 0:02:34.827 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.033) 0:02:34.860 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=773 changed=10 unreachable=0 failed=1 skipped=529 rescued=1 ignored=0

Wednesday 01 June 2022 17:27:54 +0000 (0:00:00.019) 0:02:34.880 ********
===============================================================================
linux-system-roles.storage : Update facts ------------------------------- 2.60s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.07s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 2.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.02s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.82s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.81s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.76s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : get service facts -------------------------- 1.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get required packages ---------------------- 1.44s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : get required packages ---------------------- 1.39s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : get required packages ---------------------- 1.39s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : get required packages ---------------------- 1.32s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:27:55 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:27:56 +0000 (0:00:01.388) 0:00:01.412 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.39s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_misc.yml *******************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_misc.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:2
Wednesday 01 June 2022 17:27:56 +0000
(0:00:00.029) 0:00:01.441 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:16 Wednesday 01 June 2022 17:27:57 +0000 (0:00:01.143) 0:00:02.584 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:27:57 +0000 (0:00:00.042) 0:00:02.627 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:27:58 +0000 (0:00:00.169) 0:00:02.796 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:27:58 +0000 (0:00:00.542) 0:00:03.339 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:27:58 +0000 (0:00:00.077) 0:00:03.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:27:58 +0000 (0:00:00.022) 0:00:03.439 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:27:58 +0000 (0:00:00.021) 0:00:03.460 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:27:59 +0000 (0:00:00.196) 0:00:03.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:27:59 +0000 (0:00:00.018) 0:00:03.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:00 +0000 (0:00:01.121) 0:00:04.797 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:00 +0000 (0:00:00.046) 0:00:04.844 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:00 +0000 (0:00:00.046) 0:00:04.891 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:00 +0000 (0:00:00.721) 0:00:05.612 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:28:01 +0000 (0:00:00.083) 0:00:05.696 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:28:01 +0000 (0:00:00.021) 0:00:05.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:28:01 +0000 (0:00:00.022) 0:00:05.740 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:01 +0000 (0:00:00.021) 0:00:05.761 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:01 +0000 (0:00:00.839) 0:00:06.601 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:03 +0000 (0:00:01.859) 0:00:08.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:28:03 +0000 
(0:00:00.042) 0:00:08.503 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:03 +0000 (0:00:00.027) 0:00:08.531 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.536) 0:00:09.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.030) 0:00:09.099 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.027) 0:00:09.127 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.034) 0:00:09.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.033) 0:00:09.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.071) 0:00:09.266 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.030) 0:00:09.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.030) 0:00:09.328 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.029) 0:00:09.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:28:04 +0000 (0:00:00.029) 0:00:09.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:28:05 +0000 (0:00:00.530) 0:00:09.917 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:28:05 +0000 (0:00:00.031) 0:00:09.948 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:19 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.882) 0:00:10.830 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:26 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.031) 0:00:10.862 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.043) 0:00:10.905 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.547) 0:00:11.453 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.037) 0:00:11.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.030) 0:00:11.521 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Test creating ext4 filesystem with valid parameter "-Fb 4096"] *********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:31 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.033) 0:00:11.554 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:28:06 +0000 (0:00:00.056) 0:00:11.611 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.048) 0:00:11.659 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.516) 
0:00:12.175 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.077) 0:00:12.253 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.031) 0:00:12.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.031) 0:00:12.316 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.063) 0:00:12.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.026) 0:00:12.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.031) 0:00:12.438 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_create_options": "-Fb 4096", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.037) 0:00:12.475 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.035) 0:00:12.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
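The `storage_pools` structure echoed by the `show storage_pools` task above corresponds to a role invocation roughly like the following. This is a hedged sketch assembled only from the values visible in the log; the actual `tests_misc.yml` block may be laid out differently, and the `unused_disks` indirection is an assumption based on the preceding `get_unused_disk.yml` tasks:

```yaml
# Sketch reconstructed from the logged storage_pools value -- not the
# verbatim test playbook. Role name and field values are as logged.
- hosts: all
  tasks:
    - name: Test creating ext4 filesystem with valid parameter "-Fb 4096"
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: "{{ unused_disks }}"  # resolved to ['sda'] earlier in the run
            volumes:
              - name: test1
                size: 4g
                fs_type: ext4
                fs_create_options: "-Fb 4096"
                mount_point: /opt/test1
```

The `fs_create_options` string is passed through to the mkfs tool, so for ext4 `-Fb 4096` amounts to `mkfs.ext4 -F -b 4096` (force creation, 4096-byte blocks), which is consistent with the `create format` action on `/dev/mapper/foo-test1` in the blivet output that follows.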
TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.033) 0:00:12.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:07 +0000 (0:00:00.036) 0:00:12.581 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:08 +0000 (0:00:00.062) 0:00:12.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:08 +0000 (0:00:00.029) 0:00:12.673 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:28:08 +0000 (0:00:00.041) 0:00:12.714 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:08 +0000 (0:00:00.026) 0:00:12.741 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": 
"/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:28:09 +0000 (0:00:01.793) 0:00:14.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:09 +0000 (0:00:00.035) 0:00:14.569 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:28:09 +0000 (0:00:00.029) 0:00:14.598 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:28:09 +0000 (0:00:00.038) 0:00:14.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:28:10 +0000 (0:00:00.036) 0:00:14.673 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:28:10 +0000 (0:00:00.035) 0:00:14.709 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:28:10 +0000 (0:00:00.029) 0:00:14.738 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:28:11 +0000 (0:00:00.959) 0:00:15.698 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', 
u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:28:11 +0000 (0:00:00.560) 0:00:16.258 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:28:12 +0000 (0:00:00.697) 0:00:16.955 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for 
changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:28:12 +0000 (0:00:00.421) 0:00:17.377 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:28:12 +0000 (0:00:00.030) 0:00:17.408 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:46 Wednesday 01 June 2022 17:28:13 +0000 (0:00:00.900) 0:00:18.308 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:28:13 +0000 (0:00:00.052) 0:00:18.361 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:28:13 +0000 (0:00:00.040) 0:00:18.401 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:28:13 +0000 (0:00:00.030) 0:00:18.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "56e0aee4-ec68-4a1b-afaf-628a98ee61bf" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4lgv8C-SeM6-Ysuo-sKHY-QcGf-t2M7-f81Ufv" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": 
"partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:28:14 +0000 (0:00:00.537) 0:00:18.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002959", "end": "2022-06-01 13:28:14.163898", "rc": 0, "start": "2022-06-01 13:28:14.160939" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:28:14 +0000 (0:00:00.511) 0:00:19.482 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003487", "end": "2022-06-01 13:28:14.548256", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:28:14.544769" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.386) 0:00:19.868 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] 
************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.071) 0:00:19.940 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.030) 0:00:19.971 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.063) 0:00:20.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.043) 0:00:20.077 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.508) 0:00:20.586 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:28:15 +0000 (0:00:00.044) 0:00:20.630 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.039) 0:00:20.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.037) 0:00:20.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.037) 0:00:20.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.042) 0:00:20.787 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.047) 0:00:20.835 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 
June 2022 17:28:16 +0000 (0:00:00.059) 0:00:20.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.031) 0:00:20.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.034) 0:00:20.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.032) 0:00:20.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.031) 0:00:21.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.032) 0:00:21.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.034) 0:00:21.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.032) 0:00:21.125 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.034) 0:00:21.159 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.058) 0:00:21.218 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.064) 0:00:21.283 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.033) 0:00:21.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.033) 0:00:21.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.031) 0:00:21.381 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.064) 0:00:21.446 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.042) 0:00:21.488 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.034) 0:00:21.522 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.056) 0:00:21.579 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:28:16 +0000 (0:00:00.037) 0:00:21.617 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.038) 0:00:21.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.030) 0:00:21.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.028) 0:00:21.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.028) 0:00:21.744 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.030) 0:00:21.775 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.029) 0:00:21.804 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.111) 0:00:21.915 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.067) 0:00:21.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.032) 0:00:22.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.035) 0:00:22.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.031) 0:00:22.083 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.031) 
0:00:22.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.031) 0:00:22.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.031) 0:00:22.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.032) 0:00:22.211 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.035) 0:00:22.247 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.033) 0:00:22.280 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:28:17 
+0000 (0:00:00.060) 0:00:22.341 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.036) 0:00:22.378 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.132) 0:00:22.510 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.040) 0:00:22.550 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, 
"inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "56e0aee4-ec68-4a1b-afaf-628a98ee61bf" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "56e0aee4-ec68-4a1b-afaf-628a98ee61bf" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.044) 0:00:22.595 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:28:17 +0000 (0:00:00.037) 0:00:22.633 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.036) 0:00:22.670 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.043) 0:00:22.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.031) 0:00:22.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.030) 0:00:22.776 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.028) 0:00:22.804 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.029) 0:00:22.833 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:28:18 +0000 
(0:00:00.045) 0:00:22.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.037) 0:00:22.916 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.038) 0:00:22.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.032) 0:00:22.987 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.033) 0:00:23.020 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.047) 0:00:23.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.042) 0:00:23.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104489.1501215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104489.1501215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19626, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104489.1501215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.460) 0:00:23.571 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:28:18 +0000 (0:00:00.045) 0:00:23.616 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.037) 0:00:23.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.036) 0:00:23.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.034) 0:00:23.725 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.035) 0:00:23.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.030) 0:00:23.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.028) 0:00:23.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.027) 0:00:23.848 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:28:19 
+0000 (0:00:00.035) 0:00:23.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.028) 0:00:23.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.031) 0:00:23.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.034) 0:00:23.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.038) 0:00:24.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.034) 0:00:24.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.040) 0:00:24.092 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.037) 0:00:24.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.032) 0:00:24.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.034) 0:00:24.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.077) 0:00:24.275 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.031) 0:00:24.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.030) 0:00:24.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.029) 0:00:24.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.029) 0:00:24.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.032) 0:00:24.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.034) 0:00:24.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.033) 0:00:24.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:28:19 +0000 (0:00:00.032) 0:00:24.529 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.483) 0:00:25.013 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.401) 0:00:25.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.037) 0:00:25.452 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.033) 0:00:25.486 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.032) 0:00:25.518 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.030) 0:00:25.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.031) 0:00:25.580 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:28:20 +0000 (0:00:00.030) 0:00:25.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.033) 0:00:25.645 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.036) 0:00:25.681 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.036) 0:00:25.717 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.039) 0:00:25.756 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", 
"name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.044756", "end": "2022-06-01 13:28:20.877326", "rc": 0, "start": "2022-06-01 13:28:20.832570" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.440) 0:00:26.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.040) 0:00:26.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.040) 0:00:26.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.033) 0:00:26.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.034) 0:00:26.347 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.034) 0:00:26.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.035) 0:00:26.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.033) 0:00:26.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.032) 0:00:26.483 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.040) 0:00:26.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the volume group created above] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:48 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.042) 0:00:26.566 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:28:21 +0000 (0:00:00.067) 0:00:26.634 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.047) 0:00:26.681 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.557) 0:00:27.239 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.080) 0:00:27.320 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.032) 0:00:27.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.030) 0:00:27.383 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.063) 0:00:27.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.028) 0:00:27.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.034) 0:00:27.509 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:22 +0000 (0:00:00.046) 0:00:27.556 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": 
"VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.084) 0:00:27.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.033) 0:00:27.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.033) 0:00:27.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.032) 0:00:27.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.031) 0:00:27.772 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 
June 2022 17:28:23 +0000 (0:00:00.049) 0:00:27.822 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:23 +0000 (0:00:00.029) 0:00:27.851 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:28:25 +0000 (0:00:01.889) 0:00:29.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:25 +0000 
(0:00:00.038) 0:00:29.779 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:28:25 +0000 (0:00:00.030) 0:00:29.809 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:28:25 +0000 (0:00:00.042) 0:00:29.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:28:25 +0000 (0:00:00.040) 0:00:29.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:28:25 +0000 (0:00:00.035) 0:00:29.928 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:28:25 +0000 (0:00:00.402) 0:00:30.331 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:28:26 +0000 (0:00:00.665) 0:00:30.997 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:28:26 +0000 
(0:00:00.032) 0:00:31.029 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:28:27 +0000 (0:00:00.698) 0:00:31.727 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:28:27 +0000 (0:00:00.398) 0:00:32.126 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:28:27 +0000 (0:00:00.032) 0:00:32.158 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:58 Wednesday 01 June 2022 17:28:28 +0000 (0:00:00.912) 0:00:33.071 ******** included: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:28:28 +0000 (0:00:00.054) 0:00:33.125 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:28:28 +0000 (0:00:00.036) 0:00:33.162 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:28:28 +0000 (0:00:00.031) 0:00:33.194 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:28:28 +0000 (0:00:00.440) 0:00:33.634 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003427", 
"end": "2022-06-01 13:28:28.702900", "rc": 0, "start": "2022-06-01 13:28:28.699473" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.388) 0:00:34.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003738", "end": "2022-06-01 13:28:29.093823", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:28:29.090085" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.389) 0:00:34.412 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.060) 0:00:34.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.031) 0:00:34.505 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.060) 0:00:34.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.043) 0:00:34.609 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:28:29 +0000 (0:00:00.028) 0:00:34.637 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:34.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.038) 0:00:34.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.035) 0:00:34.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.036) 0:00:34.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.035) 0:00:34.812 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.030) 0:00:34.843 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.057) 0:00:34.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.030) 
0:00:34.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.032) 0:00:34.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.034) 0:00:34.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.031) 0:00:35.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:35.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.031) 0:00:35.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:35.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:35.149 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.058) 0:00:35.208 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.028) 0:00:35.236 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.061) 0:00:35.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.037) 0:00:35.335 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:35.365 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.027) 
0:00:35.392 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.030) 0:00:35.423 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.070) 0:00:35.494 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.030) 0:00:35.525 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.032) 0:00:35.557 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.029) 0:00:35.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:28:30 +0000 (0:00:00.031) 0:00:35.618 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.029) 0:00:35.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Try to create ext4 filesystem with invalid parameter "-Fb 512"] ********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:62 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.035) 0:00:35.683 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.050) 0:00:35.734 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.044) 0:00:35.778 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.525) 0:00:36.303 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, 
"ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.072) 0:00:36.376 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.036) 0:00:36.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.031) 0:00:36.444 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.062) 0:00:36.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.026) 
0:00:36.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.031) 0:00:36.566 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_create_options": "-Fb 512", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.038) 0:00:36.604 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:31 +0000 (0:00:00.032) 0:00:36.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.031) 0:00:36.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.033) 0:00:36.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.031) 0:00:36.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.029) 0:00:36.763 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.046) 0:00:36.809 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:32 +0000 (0:00:00.028) 0:00:36.838 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs", "lvm2", "dosfstools" ], "pools": [], "volumes": [] } MSG: Failed to commit changes to disk: (FSError('format failed: 1'), '/dev/mapper/foo-test1') TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:28:33 +0000 (0:00:01.778) 0:00:38.617 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'_raw_device': u'/dev/mapper/foo-test1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/mapper/foo-test1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/foo-test1', u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u'-Fb 512'}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', 
u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [u'xfsprogs', u'e2fsprogs', u'lvm2', u'dosfstools'], u'msg': u"Failed to commit changes to disk: (FSError('format failed: 1'), '/dev/mapper/foo-test1')"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.041) 0:00:38.658 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:82 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.026) 0:00:38.685 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output when creating ext4 filesystem with invalid parameter "-Fb 512"] *** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:88 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.035) 0:00:38.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the volume group created above] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:95 Wednesday 01 June 
2022 17:28:34 +0000 (0:00:00.035) 0:00:38.756 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.041) 0:00:38.797 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.045) 0:00:38.843 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.661) 0:00:39.504 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.077) 0:00:39.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:28:34 +0000 (0:00:00.033) 0:00:39.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.033) 0:00:39.649 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.063) 0:00:39.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.027) 0:00:39.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.071) 0:00:39.811 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": 
"absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.037) 0:00:39.849 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.034) 0:00:39.884 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.039) 0:00:39.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.031) 0:00:39.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.031) 0:00:39.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:35 +0000 
(0:00:00.034) 0:00:40.021 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.043) 0:00:40.064 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:35 +0000 (0:00:00.028) 0:00:40.093 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:28:37 +0000 (0:00:01.771) 0:00:41.864 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup 
services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.029) 0:00:41.894 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.026) 0:00:41.920 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.034) 0:00:41.955 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.040) 0:00:41.995 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.035) 0:00:42.030 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.028) 0:00:42.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.031) 0:00:42.090 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.029) 0:00:42.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.031) 0:00:42.151 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { 
"atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.416) 0:00:42.568 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:28:37 +0000 (0:00:00.030) 0:00:42.599 ******** ok: [/cache/rhel-x.qcow2] TASK [Create one LVM logical volume with "4g" under one volume group] ********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:105 Wednesday 01 June 2022 17:28:38 +0000 (0:00:00.897) 0:00:43.496 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:28:38 +0000 (0:00:00.053) 0:00:43.549 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:28:38 +0000 (0:00:00.047) 0:00:43.597 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.559) 0:00:44.156 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.073) 0:00:44.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.031) 0:00:44.261 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.032) 0:00:44.293 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.073) 0:00:44.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.028) 0:00:44.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.032) 0:00:44.427 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.107) 0:00:44.534 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : 
get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.041) 0:00:44.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:28:39 +0000 (0:00:00.033) 0:00:44.609 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:28:40 +0000 (0:00:00.030) 0:00:44.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:28:40 +0000 (0:00:00.031) 0:00:44.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:28:40 +0000 (0:00:00.031) 0:00:44.703 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:28:40 +0000 (0:00:00.049) 0:00:44.752 ******** TASK [linux-system-roles.storage : 
manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:28:40 +0000 (0:00:00.030) 0:00:44.783 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:28:41 +0000 (0:00:01.783) 0:00:46.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:28:41 +0000 (0:00:00.031) 0:00:46.598 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:28:41 +0000 (0:00:00.030) 0:00:46.628 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", 
"lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:28:42 +0000 (0:00:00.041) 0:00:46.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:28:42 +0000 (0:00:00.036) 0:00:46.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:28:42 +0000 (0:00:00.038) 0:00:46.745 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:28:42 +0000 (0:00:00.031) 0:00:46.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage 
: set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:28:42 +0000 (0:00:00.723) 0:00:47.500 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:28:43 +0000 (0:00:00.438) 0:00:47.939 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:28:43 +0000 (0:00:00.677) 0:00:48.617 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", 
"readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:28:44 +0000 (0:00:00.397) 0:00:49.015 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:28:44 +0000 (0:00:00.030) 0:00:49.045 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:119 Wednesday 01 June 2022 17:28:45 +0000 (0:00:00.911) 0:00:49.956 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:28:45 +0000 (0:00:00.051) 0:00:50.007 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": 
[], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:28:45 +0000 (0:00:00.041) 0:00:50.049 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:28:45 +0000 (0:00:00.034) 0:00:50.083 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7b687690-3d0f-4f3d-b232-3d863dbd83f5" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "A9g7Ta-lFPw-F8AZ-Rhfk-ImXc-X2uN-w7pNiH" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": 
"/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:28:45 +0000 (0:00:00.398) 0:00:50.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003387", "end": "2022-06-01 13:28:45.573142", "rc": 0, "start": "2022-06-01 13:28:45.569755" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.426) 0:00:50.908 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003808", "end": "2022-06-01 13:28:45.995469", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:28:45.991661" } TASK [Verify the volumes listed in storage_pools were 
correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.459) 0:00:51.368 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.068) 0:00:51.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.034) 0:00:51.471 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.066) 0:00:51.538 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:28:46 +0000 (0:00:00.044) 0:00:51.582 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.416) 0:00:51.999 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.047) 0:00:52.046 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.039) 0:00:52.086 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.037) 0:00:52.123 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.037) 0:00:52.161 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.033) 0:00:52.195 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": 
"pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.045) 0:00:52.240 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.059) 0:00:52.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.030) 0:00:52.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.034) 0:00:52.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.031) 0:00:52.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.032) 0:00:52.429 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.032) 0:00:52.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.030) 0:00:52.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.035) 0:00:52.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.041) 0:00:52.569 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:28:47 +0000 (0:00:00.061) 0:00:52.631 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.059) 0:00:52.691 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:52.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.030) 0:00:52.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.031) 0:00:52.786 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.062) 0:00:52.849 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.038) 0:00:52.887 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.041) 0:00:52.929 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.059) 0:00:52.988 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.037) 0:00:53.025 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.038) 0:00:53.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.034) 0:00:53.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.032) 0:00:53.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.031) 0:00:53.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.032) 0:00:53.194 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:53.228 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.083) 0:00:53.311 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.070) 0:00:53.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:53.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.032) 0:00:53.448 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:53.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.037) 0:00:53.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:53.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.031) 0:00:53.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:28:48 +0000 (0:00:00.033) 0:00:53.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.080) 0:00:53.697 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.033) 0:00:53.731 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.062) 0:00:53.794 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.038) 0:00:53.832 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.121) 0:00:53.953 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.037) 0:00:53.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "7b687690-3d0f-4f3d-b232-3d863dbd83f5" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "7b687690-3d0f-4f3d-b232-3d863dbd83f5" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.041) 0:00:54.033 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.039) 0:00:54.072 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.037) 0:00:54.109 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.038) 0:00:54.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.031) 0:00:54.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.032) 0:00:54.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.029) 0:00:54.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.030) 0:00:54.271 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.049) 0:00:54.320 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.033) 0:00:54.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.035) 0:00:54.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.032) 0:00:54.421 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": 
null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.031) 0:00:54.453 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.038) 0:00:54.491 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:28:49 +0000 (0:00:00.038) 0:00:54.530 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104521.1761215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104521.1761215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 19880, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104521.1761215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.393) 0:00:54.924 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.039) 0:00:54.963 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.036) 0:00:55.000 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.036) 0:00:55.037 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.069 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.036) 0:00:55.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.042) 0:00:55.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.034) 0:00:55.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.033) 0:00:55.217 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.044) 0:00:55.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.033) 0:00:55.295 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.393 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.042) 0:00:55.469 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.038) 0:00:55.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.031) 0:00:55.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.032) 0:00:55.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.031) 0:00:55.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:28:50 +0000 (0:00:00.030) 0:00:55.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.033) 0:00:55.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.032) 0:00:55.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.030) 0:00:55.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.029) 0:00:55.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.030) 0:00:55.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.031) 0:00:55.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.033) 0:00:55.856 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:28:51 +0000 (0:00:00.472) 0:00:56.329 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.404) 0:00:56.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.037) 0:00:56.771 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] 
*******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.037) 0:00:56.808 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.031) 0:00:56.840 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.032) 0:00:56.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.032) 0:00:56.906 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.034) 0:00:56.941 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.035) 0:00:56.976 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.037) 0:00:57.013 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.035) 0:00:57.049 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.041) 0:00:57.090 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.038607", "end": "2022-06-01 13:28:52.220019", "rc": 0, "start": "2022-06-01 13:28:52.181412" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.455) 0:00:57.545 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.040) 0:00:57.586 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:28:52 +0000 (0:00:00.042) 0:00:57.628 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.038) 0:00:57.666 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.042) 0:00:57.709 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.036) 0:00:57.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.035) 0:00:57.781 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.032) 0:00:57.814 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.036) 0:00:57.851 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.032) 0:00:57.883 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Resizing with one large value which large than disk's size] **************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:123
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.032) 0:00:57.916 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.050) 0:00:57.966 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.047) 0:00:58.014 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.517) 0:00:58.532 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:28:53 +0000 (0:00:00.074) 0:00:58.607 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.038) 0:00:58.645 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.033) 0:00:58.679 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.063) 0:00:58.742 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.026) 0:00:58.768 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.032) 0:00:58.801 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "12884901888.0" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.042) 0:00:58.843 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.036) 0:00:58.879 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.030) 0:00:58.910 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.031) 0:00:58.941 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.033) 0:00:58.975 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.036) 0:00:59.011 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.048) 0:00:59.059 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:28:54 +0000 (0:00:00.029) 0:00:59.089 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: volume 'test1' cannot be resized to '12 GiB'

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:28:55 +0000 (0:00:01.373) 0:01:00.463 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'size': u'12884901888.0', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"volume 'test1' cannot be resized to '12 GiB'"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:28:55 +0000 (0:00:00.041) 0:01:00.504 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:142
Wednesday 01 June 2022 17:28:55 +0000 (0:00:00.027) 0:01:00.532 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the output when resizing with large size] *************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:148
Wednesday 01 June 2022 17:28:55 +0000 (0:00:00.035) 0:01:00.567 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Remove the volume group created above] ***********************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:155
Wednesday 01 June 2022 17:28:55 +0000 (0:00:00.035) 0:01:00.603 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.043) 0:01:00.647 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.043) 0:01:00.691 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.570) 0:01:01.261 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.074) 0:01:01.336 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.030) 0:01:01.367 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.030) 0:01:01.397 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.063) 0:01:01.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.028) 0:01:01.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.034) 0:01:01.524 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "lvm" } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.047) 0:01:01.572 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:28:56 +0000 (0:00:00.045) 0:01:01.617 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.031) 0:01:01.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.030) 0:01:01.680 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.030) 0:01:01.710 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.032) 0:01:01.743 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.046) 0:01:01.789 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:28:57 +0000 (0:00:00.034) 0:01:01.824 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:28:59 +0000 (0:00:01.848) 0:01:03.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.032) 0:01:03.705 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.031) 0:01:03.737 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.039) 0:01:03.776 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.035) 0:01:03.811 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.036) 0:01:03.848 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:28:59 +0000 (0:00:00.404) 0:01:04.252 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:29:00 +0000 (0:00:00.694) 0:01:04.947 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:29:00 +0000 (0:00:00.031) 0:01:04.978 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:29:01 +0000 (0:00:00.697) 0:01:05.675 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:29:01 +0000 (0:00:00.410) 0:01:06.086 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:29:01 +0000 (0:00:00.030) 0:01:06.117 ********
ok: [/cache/rhel-x.qcow2]

TASK [Create one partition on one disk] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:165
Wednesday 01 June 2022 17:29:02 +0000 (0:00:00.971) 0:01:07.088 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:29:02 +0000 (0:00:00.053) 0:01:07.142 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:29:02 +0000 (0:00:00.049) 0:01:07.191 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.554) 0:01:07.745 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.072) 0:01:07.818 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.032) 0:01:07.851 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.031) 0:01:07.882 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.080) 0:01:07.963 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.029) 0:01:07.992 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.036) 0:01:08.028 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.041) 0:01:08.070 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.038) 0:01:08.108 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.034) 0:01:08.143 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.031) 0:01:08.174 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.032) 0:01:08.207 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.032) 0:01:08.239 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.046) 0:01:08.286 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:29:03 +0000 (0:00:00.031) 0:01:08.318 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:29:05 +0000 (0:00:01.797) 0:01:10.115 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.032) 0:01:10.148 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.029) 0:01:10.178 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format",
"device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.040) 0:01:10.218 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.040) 0:01:10.258 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.037) 0:01:10.296 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:29:05 +0000 (0:00:00.028) 0:01:10.325 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:29:06 +0000 (0:00:00.661) 0:01:10.986 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:29:06 +0000 (0:00:00.440) 0:01:11.427 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:29:07 +0000 (0:00:00.725) 0:01:12.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:29:07 +0000 (0:00:00.425) 0:01:12.577 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:29:07 +0000 (0:00:00.031) 0:01:12.608 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:179 Wednesday 01 June 2022 17:29:08 +0000 (0:00:00.942) 0:01:13.550 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:29:08 +0000 (0:00:00.051) 0:01:13.602 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:29:09 +0000 (0:00:00.041) 0:01:13.644 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:29:09 +0000 (0:00:00.030) 0:01:13.675 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "c0a1d4e7-dc00-42bb-8558-3f98886dcf7e" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 
17:29:09 +0000 (0:00:00.412) 0:01:14.087 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003305", "end": "2022-06-01 13:29:09.167437", "rc": 0, "start": "2022-06-01 13:29:09.164132" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:29:09 +0000 (0:00:00.402) 0:01:14.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003050", "end": "2022-06-01 13:29:09.560420", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:29:09.557370" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.390) 0:01:14.880 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.070) 0:01:14.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:14.984 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.062) 0:01:15.047 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.032) 0:01:15.080 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.028) 0:01:15.108 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.031) 0:01:15.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.030) 0:01:15.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.030) 0:01:15.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.029) 0:01:15.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.264 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.028) 0:01:15.292 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.056) 0:01:15.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.030) 0:01:15.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.031) 0:01:15.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.031) 0:01:15.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:29:10 +0000 (0:00:00.033) 0:01:15.608 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.060) 0:01:15.669 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": 
null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.043) 0:01:15.712 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.109) 0:01:15.822 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.038) 0:01:15.861 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.030) 0:01:15.892 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:29:11 +0000 
(0:00:00.028) 0:01:15.921 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.032) 0:01:15.953 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.066) 0:01:16.020 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", 
"_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.040) 0:01:16.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.031) 0:01:16.092 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.060) 0:01:16.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.035) 0:01:16.187 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.141) 0:01:16.329 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.036) 0:01:16.365 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "c0a1d4e7-dc00-42bb-8558-3f98886dcf7e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "c0a1d4e7-dc00-42bb-8558-3f98886dcf7e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.044) 0:01:16.410 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.036) 0:01:16.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.039) 0:01:16.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 17:29:11 +0000 (0:00:00.038) 0:01:16.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.032) 0:01:16.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.031) 0:01:16.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:29:11 +0000 (0:00:00.033) 0:01:16.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.034) 0:01:16.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.050) 0:01:16.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.036) 0:01:16.744 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.043) 0:01:16.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.037) 0:01:16.825 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.035) 0:01:16.860 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:29:12 
+0000 (0:00:00.040) 0:01:16.901 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.042) 0:01:16.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104544.7301216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104544.7301216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 20051, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104544.7301216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.416) 0:01:17.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.038) 0:01:17.399 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.040) 0:01:17.439 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.035) 0:01:17.475 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.033) 0:01:17.508 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.040) 0:01:17.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.038) 0:01:17.587 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:29:12 +0000 (0:00:00.032) 0:01:17.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.045) 0:01:17.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we 
got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.040) 0:01:17.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.034) 0:01:17.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.031) 0:01:17.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.035) 0:01:17.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:17.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.031) 0:01:17.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.036) 0:01:17.909 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.033) 0:01:17.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.028) 0:01:17.971 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.030) 0:01:18.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.030) 0:01:18.032 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.034) 0:01:18.067 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.033) 0:01:18.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.033) 0:01:18.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:18.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:18.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.083) 0:01:18.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.033) 0:01:18.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.031) 0:01:18.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:18.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:18.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.031) 0:01:18.444 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.038) 0:01:18.483 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.032) 0:01:18.515 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:29:13 +0000 
(0:00:00.032) 0:01:18.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.033) 0:01:18.581 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:29:13 +0000 (0:00:00.036) 0:01:18.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.040) 0:01:18.659 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.042) 0:01:18.701 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.035) 0:01:18.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:18.770 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.033) 0:01:18.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:18.836 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:18.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:18.902 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:18.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.033) 0:01:18.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.031) 0:01:19.000 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:19.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.032) 0:01:19.065 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.030) 0:01:19.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Test setting up disk volume will remove the partition create above] ****** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:181 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.031) 0:01:19.127 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:29:14 +0000 (0:00:00.062) 0:01:19.189 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:29:14 +0000 
(0:00:00.046) 0:01:19.236 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.543) 0:01:19.780 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.074) 0:01:19.854 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.033) 0:01:19.888 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.034) 0:01:19.923 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.064) 0:01:19.987 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.027) 0:01:20.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.035) 0:01:20.050 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.035) 0:01:20.085 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_create_options": "-F", "fs_type": "ext4", "mount_options": "rw,noatime,defaults", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.044) 0:01:20.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.033) 0:01:20.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.032) 0:01:20.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.031) 0:01:20.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.035) 0:01:20.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.049) 0:01:20.312 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:29:15 +0000 (0:00:00.031) 0:01:20.343 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", 
"vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:29:17 +0000 (0:00:01.765) 0:01:22.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:29:17 +0000 (0:00:00.031) 0:01:22.140 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:29:17 +0000 (0:00:00.030) 0:01:22.170 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }
TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:29:17 +0000 (0:00:00.038) 0:01:22.209 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:29:17 +0000 (0:00:00.084) 0:01:22.294 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true,
"fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }
TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:29:17 +0000 (0:00:00.039) 0:01:22.333 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=c0a1d4e7-dc00-42bb-8558-3f98886dcf7e" }
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:29:18 +0000 (0:00:00.425) 0:01:22.759 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }
TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:29:18 +0000 (0:00:00.707) 0:01:23.466 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c0415c33-0b15-4152-961b-3effb2160745', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'rw,noatime,defaults', u'fstype': u'ext4'}) => { "ansible_loop_var":
"mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "mounted" }, "name": "/opt/test1", "opts": "rw,noatime,defaults", "passno": "0", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745" }
TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:29:19 +0000 (0:00:00.426) 0:01:23.893 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }
TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:29:19 +0000 (0:00:00.672) 0:01:24.565 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:29:20 +0000 (0:00:00.404) 0:01:24.969 ********
TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:29:20 +0000 (0:00:00.029) 0:01:24.999 ********
ok: [/cache/rhel-x.qcow2]
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:194
Wednesday 01 June 2022 17:29:21 +0000 (0:00:00.893) 0:01:25.892 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2
TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:29:21 +0000 (0:00:00.066) 0:01:25.959 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:29:21 +0000 (0:00:00.032) 0:01:25.992 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null,
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:29:21 +0000 (0:00:00.041) 0:01:26.033 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "c0415c33-0b15-4152-961b-3effb2160745" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }
TASK [Read the /etc/fstab file for volume existence]
***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:29:21 +0000 (0:00:00.408) 0:01:26.449 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003392", "end": "2022-06-01 13:29:21.536925", "rc": 0, "start": "2022-06-01 13:29:21.533533" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=c0415c33-0b15-4152-961b-3effb2160745 /opt/test1 ext4 rw,noatime,defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.412) 0:01:26.858 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003487", "end": "2022-06-01 13:29:21.940738", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:29:21.937251" }
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.029) 0:01:27.270 ********
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.029) 0:01:27.300 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.034) 0:01:27.334 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use.
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.064) 0:01:27.399 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.036) 0:01:27.436 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.120) 0:01:27.556 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }
TASK [Set some facts] **********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:29:22 +0000 (0:00:00.039) 0:01:27.595 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,noatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "c0415c33-0b15-4152-961b-3effb2160745" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,noatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "c0415c33-0b15-4152-961b-3effb2160745" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.043) 0:01:27.639 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.091) 0:01:27.731 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.039) 0:01:27.770 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.040) 0:01:27.811 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.032) 0:01:27.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.032) 0:01:27.875 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.031) 0:01:27.906 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.035) 0:01:27.942 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [
"UUID=c0415c33-0b15-4152-961b-3effb2160745 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 rw,noatime,defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.049) 0:01:27.991 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.035) 0:01:28.026 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.039) 0:01:28.065 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.044) 0:01:28.110 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.031) 0:01:28.141 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.039) 0:01:28.180 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:29:23 +0000 (0:00:00.039) 0:01:28.220 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104556.7241216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104556.7241216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104556.7241216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.424) 0:01:28.645 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.039) 0:01:28.684 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task
path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.042) 0:01:28.726 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.037) 0:01:28.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:28.798 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.046) 0:01:28.844 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.039) 0:01:28.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.034) 0:01:28.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday
01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:28.951 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.040) 0:01:28.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.030) 0:01:29.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:29.055 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.032) 0:01:29.087 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.031) 0:01:29.119 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.032) 0:01:29.151
********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.039) 0:01:29.191 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.037) 0:01:29.228 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:29.261 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.031) 0:01:29.293 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.031) 0:01:29.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [get information about RAID]
**********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.032) 0:01:29.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:29.390 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:29.424 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.036) 0:01:29.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.033) 0:01:29.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.031) 0:01:29.527 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.031) 0:01:29.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.030) 0:01:29.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:29:24 +0000 (0:00:00.029) 0:01:29.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.033) 0:01:29.652 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.033) 0:01:29.685 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.036) 0:01:29.722 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.033) 0:01:29.755 ********
skipping: [/cache/rhel-x.qcow2] =>
{}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.031) 0:01:29.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.032) 0:01:29.820 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.049) 0:01:29.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.032) 0:01:29.902 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.035) 0:01:29.938 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.034) 0:01:29.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about the LV] ********************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.031) 0:01:30.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.031) 0:01:30.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.032) 0:01:30.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.032) 0:01:30.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.033) 0:01:30.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.033) 0:01:30.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 
Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.032) 0:01:30.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.031) 0:01:30.231 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.036) 0:01:30.268 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the disk volume created above] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:198 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.082) 0:01:30.350 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.070) 0:01:30.421 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:29:25 +0000 (0:00:00.049) 0:01:30.470 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.559) 0:01:31.030 ******** skipping: [/cache/rhel-x.qcow2] 
=> (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.075) 0:01:31.105 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.034) 0:01:31.139 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.034) 0:01:31.174 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.063) 0:01:31.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.027) 0:01:31.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.033) 0:01:31.299 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.037) 0:01:31.337 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.034) 0:01:31.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:29:26 +0000 
(0:00:00.028) 0:01:31.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.031) 0:01:31.431 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.031) 0:01:31.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.033) 0:01:31.496 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.050) 0:01:31.546 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:29:26 +0000 (0:00:00.028) 0:01:31.575 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" 
], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:29:28 +0000 (0:00:01.417) 0:01:32.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.031) 0:01:33.023 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.031) 0:01:33.054 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", 
"device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.039) 0:01:33.094 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.038) 0:01:33.132 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", 
"_mount_id": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.036) 0:01:33.169 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c0415c33-0b15-4152-961b-3effb2160745', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=c0415c33-0b15-4152-961b-3effb2160745" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:29:28 +0000 (0:00:00.412) 0:01:33.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] 
****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:29:29 +0000 (0:00:00.680) 0:01:34.262 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:29:29 +0000 (0:00:00.031) 0:01:34.294 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:29:30 +0000 (0:00:00.674) 0:01:34.969 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:29:30 +0000 (0:00:00.398) 0:01:35.367 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:29:30 +0000 (0:00:00.030) 0:01:35.397 ******** ok: [/cache/rhel-x.qcow2] TASK [Try to mount swap filesystem to "/opt/test1"] **************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:210 Wednesday 01 June 2022 17:29:31 +0000 (0:00:00.880) 0:01:36.278 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:29:31 +0000 (0:00:00.053) 0:01:36.331 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:29:31 +0000 (0:00:00.047) 0:01:36.378 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.543) 0:01:36.922 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": 
false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.077) 0:01:36.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.034) 0:01:37.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.032) 0:01:37.066 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.067) 0:01:37.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.026) 0:01:37.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.035) 0:01:37.196 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.036) 0:01:37.233 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "swap", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.040) 0:01:37.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.032) 0:01:37.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.035) 0:01:37.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 
17:29:32 +0000 (0:00:00.033) 0:01:37.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.033) 0:01:37.410 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.047) 0:01:37.457 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:29:32 +0000 (0:00:00.028) 0:01:37.486 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: volume 'test1' has a mount point but no mountable file system TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:29:33 +0000 (0:00:01.079) 0:01:38.565 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'swap', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"volume 'test1' has a mount point but no mountable file system"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:29:33 +0000 (0:00:00.046) 0:01:38.612 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:226
Wednesday 01 June 2022 17:29:34 +0000 (0:00:00.029) 0:01:38.642 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the output when mount swap filesystem to "/opt/test1"] ************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:232
Wednesday 01 June 2022 17:29:34 +0000 (0:00:00.042) 0:01:38.684 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=472 changed=16 unreachable=0 failed=3 skipped=357 rescued=3 ignored=0

Wednesday 01 June 2022 17:29:34 +0000 (0:00:00.022) 0:01:38.706 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.80s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.79s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.42s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.39s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/tmp7247_7fr/tests/tests_misc.yml:2 ---------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.12s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Update facts ------------------------------- 0.97s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:29:34 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:29:36 +0000 (0:00:01.328) 0:00:01.351 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_misc_nvme_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_misc_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:29:36 +0000 (0:00:00.032) 0:00:01.383 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:29:36 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:29:38 +0000 (0:00:01.317) 0:00:01.340 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_misc_scsi_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_misc_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc_scsi_generated.yml:3
Wednesday 01 June 2022 17:29:38 +0000 (0:00:00.027) 0:00:01.367 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc_scsi_generated.yml:7
Wednesday 01 June 2022 17:29:39 +0000 (0:00:01.138) 0:00:02.505 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:2
Wednesday 01 June 2022 17:29:39 +0000 (0:00:00.027) 0:00:02.533 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:16
Wednesday 01 June 2022 17:29:40 +0000 (0:00:00.863) 0:00:03.396 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:29:40 +0000 (0:00:00.036) 0:00:03.433 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:29:40 +0000 (0:00:00.156) 0:00:03.589 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.529) 0:00:04.118 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.076) 0:00:04.195 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.023) 0:00:04.219 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.026) 0:00:04.245 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.196) 0:00:04.441 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:29:41 +0000 (0:00:00.019) 0:00:04.461 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:29:42 +0000 (0:00:01.083) 0:00:05.545 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:29:42 +0000 (0:00:00.055) 0:00:05.600 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:29:42 +0000 (0:00:00.052) 0:00:05.652 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:29:43 +0000 (0:00:00.704) 0:00:06.357 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:29:43 +0000 (0:00:00.077) 0:00:06.434 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:29:43 +0000 (0:00:00.018) 0:00:06.453 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:29:43 +0000 (0:00:00.019) 0:00:06.472 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:29:43 +0000 (0:00:00.018) 0:00:06.490 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:29:44 +0000 (0:00:00.833) 0:00:07.324 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": 
"cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { 
"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": 
"systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:29:46 +0000 (0:00:02.104) 0:00:09.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:29:46 +0000 (0:00:00.044) 0:00:09.473 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:29:46 +0000 (0:00:00.061) 0:00:09.534 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.569) 0:00:10.103 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.030) 0:00:10.134 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.026) 0:00:10.160 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.032) 0:00:10.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.037) 0:00:10.230 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.034) 0:00:10.264 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.028) 0:00:10.293 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.028) 0:00:10.322 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.025) 0:00:10.348 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.027) 0:00:10.375 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.504) 0:00:10.880 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:29:47 +0000 (0:00:00.030) 0:00:10.910 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:19 Wednesday 01 June 2022 17:29:48 +0000 (0:00:00.868) 0:00:11.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:26 Wednesday 01 June 2022 17:29:48 +0000 (0:00:00.029) 0:00:11.808 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:29:48 +0000 (0:00:00.045) 0:00:11.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.532) 0:00:12.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.035) 0:00:12.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused 
disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.030) 0:00:12.453 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Test creating ext4 filesystem with valid parameter "-Fb 4096"] *********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:31 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.032) 0:00:12.485 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.052) 0:00:12.538 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:29:49 +0000 (0:00:00.040) 0:00:12.578 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.515) 0:00:13.094 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.068) 0:00:13.162 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.028) 0:00:13.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.028) 0:00:13.220 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.063) 0:00:13.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.066) 0:00:13.350 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.030) 0:00:13.380 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_create_options": "-Fb 4096", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.036) 0:00:13.417 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.033) 0:00:13.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.031) 0:00:13.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.030) 0:00:13.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.033) 0:00:13.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.031) 0:00:13.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.045) 0:00:13.624 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:29:50 +0000 (0:00:00.030) 0:00:13.654 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": 
[ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:29:52 +0000 (0:00:01.751) 0:00:15.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.033) 0:00:15.439 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.029) 0:00:15.469 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.046) 0:00:15.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", 
"vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.040) 0:00:15.555 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.037) 0:00:15.594 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:29:52 +0000 (0:00:00.033) 0:00:15.627 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:29:53 +0000 (0:00:00.997) 0:00:16.625 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:29:54 +0000 
(0:00:00.548) 0:00:17.174 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:29:54 +0000 (0:00:00.652) 0:00:17.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:29:55 +0000 (0:00:00.399) 0:00:18.227 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:29:55 +0000 (0:00:00.031) 0:00:18.258 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:46 Wednesday 01 June 2022 17:29:56 +0000 (0:00:00.912) 0:00:19.170 ******** included: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:29:56 +0000 (0:00:00.051) 0:00:19.222 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-Fb 4096", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:29:56 +0000 (0:00:00.038) 0:00:19.261 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:29:56 +0000 (0:00:00.030) 0:00:19.291 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "da60f49a-4c7a-42b7-a111-9f144acddc94" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "WGQATB-nY6u-3DvK-cdVu-6jgE-ockc-Sek1Bb" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:29:56 +0000 (0:00:00.546) 0:00:19.838 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003010", "end": "2022-06-01 13:29:56.624874", "rc": 0, "start": "2022-06-01 13:29:56.621864" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.525) 0:00:20.363 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002923", "end": "2022-06-01 13:29:57.013792", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:29:57.010869" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.388) 0:00:20.752 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.104) 0:00:20.856 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.030) 0:00:20.887 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.069) 0:00:20.957 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:29:57 +0000 (0:00:00.042) 0:00:20.999 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.493) 0:00:21.493 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.041) 0:00:21.535 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.040) 0:00:21.576 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.044) 0:00:21.620 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.040) 0:00:21.661 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.034) 0:00:21.695 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.046) 0:00:21.741 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.060) 0:00:21.802 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.031) 0:00:21.834 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.033) 0:00:21.867 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.031) 0:00:21.899 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.031) 0:00:21.931 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.032) 0:00:21.964 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.031) 0:00:21.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:29:58 +0000 (0:00:00.031) 0:00:22.027 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.032) 0:00:22.059 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.057) 0:00:22.117 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.060) 0:00:22.178 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.209 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.029) 0:00:22.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.271 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.066) 0:00:22.337 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.035) 0:00:22.373 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.033) 0:00:22.407 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.056) 0:00:22.463 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.036) 0:00:22.500 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.036) 0:00:22.536 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.029) 0:00:22.566 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.030) 0:00:22.596 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.628 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.032) 0:00:22.661 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.030) 0:00:22.692 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.064) 0:00:22.757 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.066) 0:00:22.824 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.856 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.029) 0:00:22.886 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.917 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.031) 0:00:22.948 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.032) 0:00:22.981 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:29:59 +0000 (0:00:00.037) 0:00:23.018 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.087) 0:00:23.106 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.031) 0:00:23.137 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.031) 0:00:23.169 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.060) 0:00:23.229 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.039) 0:00:23.269 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.131) 0:00:23.400 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.037) 0:00:23.438 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "da60f49a-4c7a-42b7-a111-9f144acddc94" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "da60f49a-4c7a-42b7-a111-9f144acddc94" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.042) 0:00:23.481 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.039) 0:00:23.520 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.041) 0:00:23.561 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.043) 0:00:23.605 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.033) 0:00:23.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.032) 0:00:23.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.034) 0:00:23.705 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.033) 0:00:23.738 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.049) 0:00:23.788 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.033) 0:00:23.822 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.037) 0:00:23.859 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.029) 0:00:23.889 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.030) 0:00:23.919 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.038) 0:00:23.958 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:30:00 +0000 (0:00:00.039) 0:00:23.997 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104591.6061215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104591.6061215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 20277, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104591.6061215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.406) 0:00:24.404 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.038) 0:00:24.442 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.037) 0:00:24.479 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.034) 0:00:24.514 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.029) 0:00:24.544 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.036) 0:00:24.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.033) 0:00:24.614 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.031) 0:00:24.645 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.031) 0:00:24.677 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.038) 0:00:24.715 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.031) 0:00:24.746 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.031) 0:00:24.778 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.033) 0:00:24.811 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.032) 0:00:24.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.033) 0:00:24.877 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.040) 0:00:24.917 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.035) 0:00:24.953 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.031) 0:00:24.984 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.035) 0:00:25.020 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:30:01 +0000 (0:00:00.032) 0:00:25.052 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.033) 0:00:25.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.031) 0:00:25.116 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.031) 0:00:25.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.033) 0:00:25.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.035) 0:00:25.218 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.032) 0:00:25.250 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.029) 0:00:25.280 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.029) 0:00:25.310 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:30:02 +0000 (0:00:00.557) 0:00:25.868 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.406) 0:00:26.275 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.041) 0:00:26.316 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.034) 0:00:26.350 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.031) 0:00:26.382 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.032) 0:00:26.414 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.031) 0:00:26.445 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.032) 0:00:26.478 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:30:03
+0000 (0:00:00.037) 0:00:26.516 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.038) 0:00:26.555 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.033) 0:00:26.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:30:03 +0000 (0:00:00.040) 0:00:26.629 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.044129", "end": "2022-06-01 13:30:03.342143", "rc": 0, "start": "2022-06-01 13:30:03.298014" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.457) 0:00:27.087 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.040) 
0:00:27.127 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.040) 0:00:27.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.035) 0:00:27.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.032) 0:00:27.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.032) 0:00:27.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.035) 0:00:27.305 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.032) 0:00:27.338 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": 
null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.030) 0:00:27.369 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.031) 0:00:27.400 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the volume group created above] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:48 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.030) 0:00:27.431 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.062) 0:00:27.493 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:04 +0000 (0:00:00.047) 0:00:27.541 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.548) 0:00:28.089 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.073) 0:00:28.163 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.031) 0:00:28.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.033) 0:00:28.228 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.062) 0:00:28.291 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.026) 0:00:28.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.031) 0:00:28.350 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.037) 0:00:28.387 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.033) 0:00:28.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.030) 0:00:28.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.031) 0:00:28.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.032) 0:00:28.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.031) 0:00:28.546 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.091) 0:00:28.638 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:30:05 +0000 (0:00:00.029) 0:00:28.667 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", 
"/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:30:07 +0000 (0:00:01.985) 0:00:30.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:07 +0000 (0:00:00.032) 0:00:30.684 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:30:07 +0000 (0:00:00.028) 0:00:30.713 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], 
"mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:30:07 +0000 (0:00:00.036) 0:00:30.750 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:30:07 +0000 (0:00:00.038) 0:00:30.788 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:30:07 +0000 (0:00:00.037) 0:00:30.825 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': 
u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:30:08 +0000 (0:00:00.425) 0:00:31.251 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:30:08 +0000 (0:00:00.697) 0:00:31.948 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:30:08 +0000 (0:00:00.031) 0:00:31.979 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:30:09 +0000 (0:00:00.686) 0:00:32.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, 
"issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:30:10 +0000 (0:00:00.403) 0:00:33.069 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:30:10 +0000 (0:00:00.030) 0:00:33.099 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:58 Wednesday 01 June 2022 17:30:10 +0000 (0:00:00.853) 0:00:33.952 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:30:10 +0000 (0:00:00.059) 0:00:34.012 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 
Wednesday 01 June 2022 17:30:10 +0000 (0:00:00.039) 0:00:34.052 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:30:11 +0000 (0:00:00.031) 0:00:34.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:30:11 
+0000 (0:00:00.399) 0:00:34.484 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003254", "end": "2022-06-01 13:30:11.147734", "rc": 0, "start": "2022-06-01 13:30:11.144480" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:30:11 +0000 (0:00:00.402) 0:00:34.886 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002800", "end": "2022-06-01 13:30:11.537754", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:30:11.534954" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.390) 0:00:35.277 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.058) 0:00:35.336 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.032) 0:00:35.368 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.063) 0:00:35.431 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.040) 0:00:35.472 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.028) 0:00:35.501 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.028) 0:00:35.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.039) 0:00:35.568 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.093) 0:00:35.662 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.038) 0:00:35.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.033) 0:00:35.733 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.029) 0:00:35.762 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.059) 0:00:35.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.030) 
0:00:35.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.032) 0:00:35.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.030) 0:00:35.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.033) 0:00:35.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.035) 0:00:35.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.034) 0:00:36.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:30:12 +0000 (0:00:00.033) 0:00:36.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.032) 0:00:36.085 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.057) 0:00:36.143 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.027) 0:00:36.170 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.063) 0:00:36.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.037) 0:00:36.271 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.027) 0:00:36.299 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.026) 
0:00:36.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.031) 0:00:36.357 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.065) 0:00:36.422 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.028) 0:00:36.451 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.030) 0:00:36.481 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.030) 0:00:36.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.031) 0:00:36.544 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.041) 0:00:36.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Try to create ext4 filesystem with invalid parameter "-Fb 512"] ********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:62 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.031) 0:00:36.617 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.048) 0:00:36.665 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:13 +0000 (0:00:00.045) 0:00:36.711 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.557) 0:00:37.269 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, 
"ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.072) 0:00:37.341 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.032) 0:00:37.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.031) 0:00:37.405 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.062) 0:00:37.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:14 +0000 (0:00:00.024) 
0:00:37.492 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.032)       0:00:37.524 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "lvm",
            "volumes": [
                {
                    "fs_create_options": "-Fb 512",
                    "fs_type": "ext4",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.034)       0:00:37.561 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.032)       0:00:37.596 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.032)       0:00:37.628 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.031)       0:00:37.660 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.029)       0:00:37.689 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.031)       0:00:37.721 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.046)       0:00:37.767 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022  17:30:14 +0000 (0:00:00.029)       0:00:37.796 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "e2fsprogs",
        "dosfstools",
        "xfsprogs",
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

MSG:

Failed to commit changes to disk: (FSError('format failed: 1'), '/dev/mapper/foo-test1')

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022  17:30:16 +0000 (0:00:01.727)       0:00:39.524 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'_raw_device': u'/dev/mapper/foo-test1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/mapper/foo-test1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/foo-test1', u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u'-Fb 512'}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', 
u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [u'e2fsprogs', u'dosfstools', u'xfsprogs', u'lvm2'], u'msg': u"Failed to commit changes to disk: (FSError('format failed: 1'), '/dev/mapper/foo-test1')"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:16 +0000 (0:00:00.043) 0:00:39.567 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:82 Wednesday 01 June 2022 17:30:16 +0000 (0:00:00.028) 0:00:39.595 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output when creating ext4 filesystem with invalid parameter "-Fb 512"] *** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:88 Wednesday 01 June 2022 17:30:16 +0000 (0:00:00.044) 0:00:39.640 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the volume group created above] *********************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:95 Wednesday 01 June 
2022 17:30:16 +0000 (0:00:00.034) 0:00:39.675 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:16 +0000 (0:00:00.044) 0:00:39.719 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:16 +0000 (0:00:00.043) 0:00:39.762 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.535) 0:00:40.298 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.071) 0:00:40.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.031) 0:00:40.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.030) 0:00:40.431 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.064) 0:00:40.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.028) 0:00:40.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.033) 0:00:40.558 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": 
"absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.038) 0:00:40.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.033) 0:00:40.630 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.033) 0:00:40.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.033) 0:00:40.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.032) 0:00:40.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:30:17 +0000 
(0:00:00.031) 0:00:40.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.048) 0:00:40.809 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:30:17 +0000 (0:00:00.029) 0:00:40.838 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:30:19 +0000 (0:00:01.747) 0:00:42.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup 
services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.029) 0:00:42.615 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.029) 0:00:42.644 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.038) 0:00:42.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.034) 0:00:42.717 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.033) 0:00:42.751 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.029) 0:00:42.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.030) 0:00:42.810 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.029) 0:00:42.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:30:19 +0000 (0:00:00.032) 0:00:42.872 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { 
"atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:30:20 +0000 (0:00:00.431) 0:00:43.303 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:30:20 +0000 (0:00:00.030) 0:00:43.334 ******** ok: [/cache/rhel-x.qcow2] TASK [Create one LVM logical volume with "4g" under one volume group] ********** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:105 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.885) 0:00:44.219 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.091) 0:00:44.311 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.044) 0:00:44.355 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.522) 0:00:44.877 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.069) 0:00:44.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.029) 0:00:44.977 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:21 +0000 (0:00:00.030) 0:00:45.007 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.060) 0:00:45.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.027) 0:00:45.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.036) 0:00:45.133 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.036) 0:00:45.169 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : 
get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.034) 0:00:45.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.030) 0:00:45.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.030) 0:00:45.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.033) 0:00:45.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.030) 0:00:45.328 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.042) 0:00:45.370 ******** TASK [linux-system-roles.storage : 
manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:30:22 +0000 (0:00:00.026) 0:00:45.397 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:30:24 +0000 (0:00:01.769) 0:00:47.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.031) 0:00:47.198 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.029) 0:00:47.227 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", 
"dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.041) 0:00:47.269 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.038) 0:00:47.307 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.036) 0:00:47.343 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.030) 0:00:47.373 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage 
: set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:30:24 +0000 (0:00:00.680) 0:00:48.054 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:30:25 +0000 (0:00:00.431) 0:00:48.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:30:26 +0000 (0:00:00.675) 0:00:49.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", 
"readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:30:26 +0000 (0:00:00.407) 0:00:49.569 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:30:26 +0000 (0:00:00.029) 0:00:49.599 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:119 Wednesday 01 June 2022 17:30:27 +0000 (0:00:00.950) 0:00:50.549 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:30:27 +0000 (0:00:00.052) 0:00:50.602 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": 
[], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:30:27 +0000 (0:00:00.043) 0:00:50.645 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:30:27 +0000 (0:00:00.034) 0:00:50.680 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "d0aab55c-5f72-4eb0-91ba-bd86d15efd9c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ipCoyA-w0pM-ZOF4-3txL-AdRy-r1bn-Y2KQ0f" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": 
"/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:30:28 +0000 (0:00:00.425) 0:00:51.106 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003070", "end": "2022-06-01 13:30:27.753955", "rc": 0, "start": "2022-06-01 13:30:27.750885" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:30:28 +0000 (0:00:00.385) 0:00:51.492 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003145", "end": "2022-06-01 13:30:28.160773", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:30:28.157628" } TASK [Verify the volumes listed in storage_pools were
correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:30:28 +0000 (0:00:00.410) 0:00:51.902 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:30:28 +0000 (0:00:00.065) 0:00:51.968 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:30:28 +0000 (0:00:00.036) 0:00:52.004 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.073) 0:00:52.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.046) 0:00:52.123 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.389) 0:00:52.512 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/sda" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.044) 0:00:52.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.039) 0:00:52.596 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.038) 0:00:52.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.036) 0:00:52.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.030) 0:00:52.701 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "ansible_loop_var": 
"pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.044) 0:00:52.746 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.059) 0:00:52.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.032) 0:00:52.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.031) 0:00:52.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.033) 0:00:52.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.032) 0:00:52.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.032) 0:00:52.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.036) 0:00:53.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:30:29 +0000 (0:00:00.034) 0:00:53.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.033) 0:00:53.073 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.060) 0:00:53.133 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.112) 0:00:53.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] 
******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.033) 0:00:53.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.031) 0:00:53.311 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.029) 0:00:53.341 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.065) 0:00:53.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.036) 0:00:53.443 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.036) 0:00:53.479 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.058) 0:00:53.538 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.037) 0:00:53.575 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.038) 0:00:53.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.033) 0:00:53.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.032) 0:00:53.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.031) 0:00:53.711 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.032) 0:00:53.743 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.034) 0:00:53.778 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.065) 0:00:53.843 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.072) 0:00:53.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.033) 0:00:53.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.033) 0:00:53.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:30:30 +0000 (0:00:00.034) 0:00:54.017 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.041) 0:00:54.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.036) 0:00:54.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.037) 0:00:54.132 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.033) 0:00:54.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.034) 0:00:54.200 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.033) 0:00:54.234 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.063) 0:00:54.297 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.038) 0:00:54.335 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.129) 0:00:54.464 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.040) 0:00:54.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "d0aab55c-5f72-4eb0-91ba-bd86d15efd9c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 955110, "block_size": 4096, "block_total": 1011640, "block_used": 56530, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 3912130560, "size_total": 4143677440, "uuid": "d0aab55c-5f72-4eb0-91ba-bd86d15efd9c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.047) 0:00:54.552 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.044) 0:00:54.597 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.040) 0:00:54.637 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.041) 0:00:54.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.033) 0:00:54.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.032) 0:00:54.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.032) 0:00:54.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.033) 0:00:54.812 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.054) 0:00:54.867 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.036) 0:00:54.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.039) 0:00:54.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.032) 0:00:54.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": 
null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.034) 0:00:55.009 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:30:31 +0000 (0:00:00.045) 0:00:55.054 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.039) 0:00:55.094 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104623.3641214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104623.3641214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 20527, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104623.3641214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.463) 0:00:55.557 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.042) 0:00:55.599 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.042) 0:00:55.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.041) 0:00:55.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.036) 0:00:55.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.042) 0:00:55.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.032) 0:00:55.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.033) 0:00:55.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.031) 0:00:55.860 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.040) 0:00:55.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.034) 0:00:55.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.045) 0:00:55.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.037) 0:00:56.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:30:32 +0000 (0:00:00.035) 0:00:56.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.037) 0:00:56.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.043) 0:00:56.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.037) 0:00:56.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.036) 0:00:56.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.033) 0:00:56.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.032) 0:00:56.273 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.035) 0:00:56.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.033) 0:00:56.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.033) 0:00:56.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.035) 0:00:56.411 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.032) 0:00:56.443 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.033) 0:00:56.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.036) 0:00:56.513 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.032) 0:00:56.546 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:30:33 +0000 (0:00:00.390) 0:00:56.937 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.399) 0:00:57.336 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.041) 0:00:57.378 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.038) 0:00:57.416 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.033) 0:00:57.449 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.033) 0:00:57.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.033) 0:00:57.516 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.032) 0:00:57.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.034) 0:00:57.583 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.040) 0:00:57.624 
******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.036) 0:00:57.660 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:30:34 +0000 (0:00:00.042) 0:00:57.703 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036497", "end": "2022-06-01 13:30:34.399152", "rc": 0, "start": "2022-06-01 13:30:34.362655" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.437) 0:00:58.140 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.039) 0:00:58.179 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.039) 0:00:58.218 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.033) 0:00:58.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.034) 0:00:58.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.032) 0:00:58.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.036) 0:00:58.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.032) 0:00:58.388 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.032) 0:00:58.420 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:30:35 +0000 
(0:00:00.028) 0:00:58.449 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Resizing with one large value which large than disk's size] **************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:123
Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.079) 0:00:58.528 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.051) 0:00:58.579 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:30:35 +0000 (0:00:00.045) 0:00:58.625 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.540) 0:00:59.165 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ],
"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.070) 0:00:59.236 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.032) 0:00:59.268 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.033) 0:00:59.302 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.066) 0:00:59.368 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.027) 0:00:59.395 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result
was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.031) 0:00:59.427 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "12884901888.0" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.038) 0:00:59.471 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.032) 0:00:59.509 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.033) 0:00:59.541 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.033) 0:00:59.575 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.033) 0:00:59.609 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.032) 0:00:59.642 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.046) 0:00:59.689 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:30:36 +0000 (0:00:00.032) 0:00:59.721 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: volume 'test1' cannot be resized to '12 GiB'

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:30:38 +0000 (0:00:01.371) 0:01:01.092 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> {
    "changed": false
}

MSG:

{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'size': u'12884901888.0', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"volume 'test1' cannot be resized to '12 GiB'"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022  17:30:38 +0000 (0:00:00.049)       0:01:01.142 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:142
Wednesday 01 June 2022  17:30:38 +0000 (0:00:00.031)       0:01:01.174 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the output when resizing with large size] *************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:148
Wednesday 01 June 2022  17:30:38 +0000 (0:00:00.039)       0:01:01.213 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Remove the volume group created above] ***********************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:155
Wednesday 01 June 2022  17:30:38 +0000 (0:00:00.038)       0:01:01.252 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.051) 0:01:01.303 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.046) 0:01:01.350 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.559) 0:01:01.909 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.076) 0:01:01.985 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.034) 0:01:02.020 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:38 +0000 (0:00:00.033) 0:01:02.053 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.066) 0:01:02.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.027) 0:01:02.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.033) 0:01:02.182 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "lvm" } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.038) 0:01:02.220 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.034) 0:01:02.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.031) 0:01:02.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.032) 0:01:02.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.095) 0:01:02.415 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.033) 0:01:02.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, 
"changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.047) 0:01:02.496 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:30:39 +0000 (0:00:00.028) 0:01:02.525 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:30:41 +0000 (0:00:01.891) 0:01:04.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.032) 0:01:04.449 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.030) 0:01:04.480 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.038) 0:01:04.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.035) 0:01:04.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.035) 0:01:04.589 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:30:41 +0000 (0:00:00.412) 0:01:05.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:30:42 +0000 (0:00:00.705) 0:01:05.708 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:30:42 +0000 (0:00:00.030) 0:01:05.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:30:43 +0000 (0:00:00.696) 0:01:06.436 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:30:43 +0000 (0:00:00.387) 0:01:06.823 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:30:43 +0000 (0:00:00.029) 0:01:06.853 ******** ok: [/cache/rhel-x.qcow2] TASK [Create 
one partition on one disk] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:165 Wednesday 01 June 2022 17:30:44 +0000 (0:00:00.881) 0:01:07.734 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:30:44 +0000 (0:00:00.053) 0:01:07.788 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:30:44 +0000 (0:00:00.044) 0:01:07.833 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.570) 0:01:08.403 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.067) 0:01:08.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.029) 0:01:08.501 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.031) 0:01:08.533 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.062) 0:01:08.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.025) 0:01:08.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:30:45 +0000 
(0:00:00.032) 0:01:08.653 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.037) 0:01:08.690 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.033) 0:01:08.723 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.032) 0:01:08.756 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.070) 0:01:08.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.030) 0:01:08.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.029) 0:01:08.886 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.045) 0:01:08.932 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:30:45 +0000 (0:00:00.029) 0:01:08.961 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": 
"/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:30:47 +0000 (0:00:01.760) 0:01:10.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.031) 0:01:10.753 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.030) 0:01:10.784 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", 
"device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.041) 0:01:10.825 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.049) 0:01:10.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.035) 0:01:10.910 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:30:47 +0000 (0:00:00.034) 0:01:10.945 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:30:48 +0000 (0:00:00.673) 0:01:11.618 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=71d6faaf-e416-4810-8dfa-46609643404e', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:30:48 +0000 (0:00:00.423) 0:01:12.042 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:30:49 +0000 (0:00:00.668) 0:01:12.710 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:30:50 +0000 (0:00:00.398) 0:01:13.109 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:30:50 +0000 (0:00:00.034) 0:01:13.143 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:179 Wednesday 01 June 2022 17:30:50 +0000 (0:00:00.898) 0:01:14.042 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:30:51 +0000 (0:00:00.052) 0:01:14.095 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:30:51 +0000 (0:00:00.041) 0:01:14.136 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:30:51 +0000 (0:00:00.030) 0:01:14.166 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "71d6faaf-e416-4810-8dfa-46609643404e" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 
17:30:51 +0000 (0:00:00.411) 0:01:14.577 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003044", "end": "2022-06-01 13:30:51.240407", "rc": 0, "start": "2022-06-01 13:30:51.237363" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=71d6faaf-e416-4810-8dfa-46609643404e /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:30:51 +0000 (0:00:00.402) 0:01:14.980 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003255", "end": "2022-06-01 13:30:51.635002", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:30:51.631747" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.396) 0:01:15.376 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.065) 0:01:15.442 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.034) 0:01:15.477 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.067) 0:01:15.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.033) 0:01:15.578 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.033) 0:01:15.611 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.034) 0:01:15.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.035) 0:01:15.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.091) 0:01:15.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.030) 0:01:15.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.028) 0:01:15.832 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.025) 0:01:15.857 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.056) 0:01:15.914 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.036) 0:01:15.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.037) 0:01:15.987 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.033) 0:01:16.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:30:52 +0000 (0:00:00.033) 0:01:16.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.032) 0:01:16.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.035) 0:01:16.121 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.035) 0:01:16.157 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.032) 0:01:16.189 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.058) 0:01:16.248 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71d6faaf-e416-4810-8dfa-46609643404e', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": 
null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.043) 0:01:16.291 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.061) 0:01:16.353 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.036) 0:01:16.390 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.029) 0:01:16.419 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:30:53 +0000 
(0:00:00.030) 0:01:16.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.035) 0:01:16.485 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.064) 0:01:16.550 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=71d6faaf-e416-4810-8dfa-46609643404e', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", 
"_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.040) 0:01:16.590 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.033) 0:01:16.624 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.057) 0:01:16.681 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.035) 0:01:16.717 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.123) 0:01:16.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.038) 0:01:16.880 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71d6faaf-e416-4810-8dfa-46609643404e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "71d6faaf-e416-4810-8dfa-46609643404e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.044) 0:01:16.924 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.039) 0:01:16.964 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:30:53 +0000 (0:00:00.037) 0:01:17.001 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 17:30:53 +0000 (0:00:00.038) 0:01:17.039 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.030) 0:01:17.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.033) 0:01:17.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.030) 0:01:17.134 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.032) 0:01:17.167 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=71d6faaf-e416-4810-8dfa-46609643404e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.049) 0:01:17.217 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.036) 0:01:17.253 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.036) 0:01:17.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.035) 0:01:17.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.032) 0:01:17.357 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:30:54 
+0000 (0:00:00.039) 0:01:17.397 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.043) 0:01:17.441 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104646.9161215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104646.9161215, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 20694, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104646.9161215, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.391) 0:01:17.833 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.039) 0:01:17.872 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.037) 0:01:17.910 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.087) 0:01:17.998 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:30:54 +0000 (0:00:00.033) 0:01:18.031 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.035) 0:01:18.067 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.031) 0:01:18.099 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.031) 0:01:18.131 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.163 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.042) 0:01:18.206 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.033) 0:01:18.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.272 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.305 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.034) 0:01:18.340 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.030) 0:01:18.371 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.039) 0:01:18.410 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.034) 0:01:18.445 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.030) 0:01:18.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.028) 0:01:18.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.028) 0:01:18.533 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.029) 0:01:18.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.034) 0:01:18.597 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.033) 0:01:18.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.035) 0:01:18.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.030) 0:01:18.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.761 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.036) 0:01:18.798 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.033) 0:01:18.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.864 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:18.896 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.034) 0:01:18.930 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.031) 0:01:18.962 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.034) 0:01:18.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:30:55 +0000 (0:00:00.032) 0:01:19.029 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.033) 0:01:19.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.034) 0:01:19.097 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.036) 0:01:19.134 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.035) 0:01:19.169 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.033) 0:01:19.203 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.030) 0:01:19.233 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.030) 0:01:19.264 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.033) 0:01:19.297 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.032) 0:01:19.330 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.032) 0:01:19.362 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.035) 0:01:19.397 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.034) 0:01:19.432 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.033) 0:01:19.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.031) 0:01:19.497 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.029) 0:01:19.526 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Test setting up disk volume will remove the partition create above] ******
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:181
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.033) 0:01:19.560 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.064) 0:01:19.625 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:30:56 +0000 (0:00:00.045) 0:01:19.670 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.559) 0:01:20.230 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.076) 0:01:20.306 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.032) 0:01:20.339 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.032) 0:01:20.371 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.104) 0:01:20.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.027) 0:01:20.503 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.030) 0:01:20.533 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.033) 0:01:20.567 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_create_options": "-F", "fs_type": "ext4", "mount_options": "rw,noatime,defaults", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.037) 0:01:20.604 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.032) 0:01:20.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.030) 0:01:20.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.032) 0:01:20.700 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.031) 0:01:20.732 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.046) 0:01:20.778 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:30:57 +0000 (0:00:00.030) 0:01:20.808 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:30:59 +0000 (0:00:01.776) 0:01:22.585 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:30:59 +0000 (0:00:00.032) 0:01:22.618 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:30:59 +0000 (0:00:00.029) 0:01:22.647 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "absent" }, { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:30:59 +0000 (0:00:00.041) 0:01:22.689 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:30:59 +0000 (0:00:00.040) 0:01:22.729 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:30:59 +0000 (0:00:00.043) 0:01:22.772 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=71d6faaf-e416-4810-8dfa-46609643404e', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=71d6faaf-e416-4810-8dfa-46609643404e" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:31:00 +0000 (0:00:00.403) 0:01:23.176 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:31:00 +0000 (0:00:00.673) 0:01:23.850 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'rw,noatime,defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "rw,noatime,defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "mounted" }, "name": "/opt/test1", "opts": "rw,noatime,defaults", "passno": "0", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:31:01 +0000 (0:00:00.433) 0:01:24.283 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:31:01 +0000 (0:00:00.697) 0:01:24.981 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:31:02 +0000 (0:00:00.412) 0:01:25.394 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:31:02 +0000 (0:00:00.031) 0:01:25.425 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:194
Wednesday 01 June 2022 17:31:03 +0000 (0:00:00.904) 0:01:26.329 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:31:03 +0000 (0:00:00.055) 0:01:26.385 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:31:03 +0000 (0:00:00.032) 0:01:26.418 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "-F", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "rw,noatime,defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:31:03 +0000 (0:00:00.040) 0:01:26.458 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "ext4", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "69237b39-20f5-4ada-9964-326a8ba4cf0b" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:31:03 +0000 (0:00:00.407) 0:01:26.866 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003056", "end": "2022-06-01 13:31:03.504852", "rc": 0, "start": "2022-06-01 13:31:03.501796" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b /opt/test1 ext4 rw,noatime,defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.425) 0:01:27.291 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002959", "end": "2022-06-01 13:31:03.945118", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:31:03.942159" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.394) 0:01:27.686 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.029) 0:01:27.715 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.032) 0:01:27.748 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.063) 0:01:27.811 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.039) 0:01:27.851 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.120) 0:01:27.971 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.035) 0:01:28.006 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,noatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "69237b39-20f5-4ada-9964-326a8ba4cf0b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/sda", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,noatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "69237b39-20f5-4ada-9964-326a8ba4cf0b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:31:04 +0000 (0:00:00.040) 0:01:28.047 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.037) 0:01:28.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.036) 0:01:28.121 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed 
TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.039) 0:01:28.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.032) 0:01:28.193 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.033) 0:01:28.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.032) 0:01:28.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.032) 0:01:28.292 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 rw,noatime,defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.054) 0:01:28.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.036) 0:01:28.383 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.040) 0:01:28.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.044) 0:01:28.468 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.031) 0:01:28.499 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed 
TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.037) 0:01:28.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.035) 0:01:28.572 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104658.7831216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104658.7831216, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 312, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104658.7831216, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.412) 0:01:28.984 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:31:05 +0000 (0:00:00.038) 0:01:29.023 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.037) 0:01:29.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.034) 0:01:29.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.028) 0:01:29.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.035) 0:01:29.159 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.030) 0:01:29.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 
01 June 2022 17:31:06 +0000 (0:00:00.033) 0:01:29.255 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.038) 0:01:29.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.387 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.030) 0:01:29.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.034) 0:01:29.452 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.039) 0:01:29.491 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.036) 0:01:29.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.030) 0:01:29.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.030) 0:01:29.622 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.034) 0:01:29.656 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.036) 0:01:29.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.034) 0:01:29.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.759 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.036) 0:01:29.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.034) 0:01:29.830 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.083) 0:01:29.913 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.031) 0:01:29.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.032) 0:01:30.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:31:06 +0000 (0:00:00.030) 0:01:30.039 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.035) 0:01:30.074 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.035) 0:01:30.110 ******** skipping: [/cache/rhel-x.qcow2] => 
{} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.033) 0:01:30.176 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.209 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.241 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.274 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.038) 0:01:30.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.031) 0:01:30.343 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.031) 0:01:30.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.033) 0:01:30.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.033) 0:01:30.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.035) 0:01:30.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 
Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.031) 0:01:30.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.573 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.032) 0:01:30.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the disk volume created above] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:198 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.030) 0:01:30.636 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.078) 0:01:30.715 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:31:07 +0000 (0:00:00.044) 0:01:30.759 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.552) 0:01:31.312 ******** skipping: [/cache/rhel-x.qcow2] 
=> (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.072) 0:01:31.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.032) 0:01:31.417 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.031) 0:01:31.449 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.065) 0:01:31.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.027) 0:01:31.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.033) 0:01:31.575 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.034) 0:01:31.609 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.035) 0:01:31.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:31:08 +0000 
(0:00:00.034) 0:01:31.679 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.033) 0:01:31.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.032) 0:01:31.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.032) 0:01:31.778 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.046) 0:01:31.824 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:31:08 +0000 (0:00:00.029) 0:01:31.853 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:31:10 +0000 (0:00:01.397) 0:01:33.251 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.028) 0:01:33.280 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.026) 0:01:33.306 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.036) 0:01:33.342 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.036) 0:01:33.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.036) 0:01:33.415 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=69237b39-20f5-4ada-9964-326a8ba4cf0b" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:31:10 +0000 (0:00:00.408) 0:01:33.824 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:31:11 +0000 (0:00:00.690) 0:01:34.514 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:31:11 +0000 (0:00:00.031) 0:01:34.546 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:31:12 +0000 (0:00:00.707) 0:01:35.254 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:31:12 +0000 (0:00:00.402) 0:01:35.656 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:31:12 +0000 (0:00:00.032) 0:01:35.689 ********
ok: [/cache/rhel-x.qcow2]

TASK [Try to mount swap filesystem to "/opt/test1"] ****************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:210
Wednesday 01 June 2022 17:31:13 +0000 (0:00:00.882) 0:01:36.571 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:31:13 +0000 (0:00:00.050) 0:01:36.621 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:31:13 +0000 (0:00:00.044) 0:01:36.666 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.540) 0:01:37.207 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.075) 0:01:37.283 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.032) 0:01:37.315 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.032) 0:01:37.348 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.065) 0:01:37.414 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.030) 0:01:37.444 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.032) 0:01:37.476 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.035) 0:01:37.511 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda" ], "fs_type": "swap", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.038) 0:01:37.550 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.031) 0:01:37.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.030) 0:01:37.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.034) 0:01:37.646 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.029) 0:01:37.676 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.045) 0:01:37.721 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:31:14 +0000 (0:00:00.027) 0:01:37.748 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG:
volume 'test1' has a mount point but no mountable file system

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:31:15 +0000 (0:00:01.131) 0:01:38.880 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false }

MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'swap', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'cache_mode': None, u'name': u'test1', u'cached': False, u'type': u'disk', u'disks': [u'sda'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"volume 'test1' has a mount point but no mountable file system"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:31:15 +0000 (0:00:00.044) 0:01:38.925 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:226
Wednesday 01 June 2022 17:31:15 +0000 (0:00:00.031) 0:01:38.956 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed

TASK [Verify the output when mount swap filesystem to "/opt/test1"] ************
task path: /tmp/tmp7247_7fr/tests/tests_misc.yml:232
Wednesday 01 June 2022 17:31:15 +0000 (0:00:00.038) 0:01:38.995 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:
All assertions passed
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=474 changed=16 unreachable=0 failed=3 skipped=357 rescued=3 ignored=0

Wednesday 01 June 2022 17:31:15 +0000 (0:00:00.021) 0:01:39.017 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 2.10s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.99s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.76s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.75s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.75s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.40s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.37s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/tmp7247_7fr/tests/tests_misc_scsi_generated.yml:3 ------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.13s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.00s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.95s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:31:16 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:31:18 +0000 (0:00:01.353) 0:00:01.376 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.35s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_missing_volume_type_in_pool.yml ********************************
1 plays in /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:2
Wednesday 01 June 2022 17:31:18 +0000 (0:00:00.013) 0:00:01.390 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:9
Wednesday 01 June 2022 17:31:19 +0000 (0:00:01.120) 0:00:02.511 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:31:19 +0000 (0:00:00.040) 0:00:02.551 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:31:19 +0000 (0:00:00.162) 0:00:02.713 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:31:19 +0000 (0:00:00.538) 0:00:03.252 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:31:20 +0000 (0:00:00.076) 0:00:03.329 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:31:20 +0000 (0:00:00.034) 0:00:03.364 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:31:20 +0000 (0:00:00.034) 0:00:03.399 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:31:20 +0000 (0:00:00.196) 0:00:03.595 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:31:20 +0000 (0:00:00.019) 0:00:03.615 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:
Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:31:21 +0000 (0:00:01.081) 0:00:04.696 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:31:21 +0000 (0:00:00.047) 0:00:04.744 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:31:21 +0000 (0:00:00.045) 0:00:04.789 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:31:22 +0000 (0:00:00.747) 0:00:05.537 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:31:22 +0000 (0:00:00.083) 0:00:05.620 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:31:22 +0000 (0:00:00.020) 0:00:05.641 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:31:22 +0000 (0:00:00.021) 0:00:05.662 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:31:22 +0000 (0:00:00.019) 0:00:05.682 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:
Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:31:23 +0000 (0:00:00.922) 0:00:06.604 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false 
} TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:31:26 +0000 (0:00:02.918) 0:00:09.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.044) 0:00:09.567 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.028) 0:00:09.595 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.546) 0:00:10.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.029) 0:00:10.171 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.026) 0:00:10.198 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": 
[], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.031) 0:00:10.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:31:26 +0000 (0:00:00.032) 0:00:10.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.034) 0:00:10.296 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.060) 0:00:10.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.029) 0:00:10.386 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.027) 0:00:10.413 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : 
retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.029) 0:00:10.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.488) 0:00:10.932 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:31:27 +0000 (0:00:00.030) 0:00:10.963 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:12 Wednesday 01 June 2022 17:31:28 +0000 (0:00:00.912) 0:00:11.876 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, 
"changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:19 Wednesday 01 June 2022 17:31:28 +0000 (0:00:00.030) 0:00:11.906 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:31:28 +0000 (0:00:00.045) 0:00:11.952 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.551) 0:00:12.504 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.036) 0:00:12.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.028) 0:00:12.569 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] } TASK [Create a partition device mounted on "/opt/test1"] *********************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:23 Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.038) 0:00:12.608 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.056) 0:00:12.665 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.042) 0:00:12.707 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:31:29 +0000 (0:00:00.539) 0:00:13.246 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.070) 0:00:13.317 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.031) 0:00:13.348 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.032) 0:00:13.380 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.062) 0:00:13.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.025) 0:00:13.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.030) 0:00:13.498 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.036) 0:00:13.534 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.033) 0:00:13.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.030) 0:00:13.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.033) 0:00:13.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.063) 0:00:13.695 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.031) 0:00:13.726 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.043) 0:00:13.770 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:31:30 +0000 (0:00:00.028) 0:00:13.798 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:31:32 +0000 (0:00:01.785) 0:00:15.584 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.031) 0:00:15.615 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.040) 0:00:15.656 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.048) 0:00:15.704 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.039) 0:00:15.743 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.036) 0:00:15.780 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:31:32 +0000 (0:00:00.035) 0:00:15.816 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:31:33 +0000 (0:00:01.006) 0:00:16.822 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:31:34 +0000 (0:00:00.566) 0:00:17.389 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:31:34 +0000 (0:00:00.690) 0:00:18.079 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:31:35 +0000 (0:00:00.392) 0:00:18.472 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:31:35 +0000 (0:00:00.030) 0:00:18.502 ********
ok: [/cache/rhel-x.qcow2]

TASK [Ensure the inherited type is reflected in blivet module output] **********
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:36
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.927) 0:00:19.429 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:41
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.037) 0:00:19.466 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.053) 0:00:19.519 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.037) 0:00:19.557 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.027) 0:00:19.585 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "372ae1bd-0496-4063-aec0-a4f928956b39" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:31:36 +0000 (0:00:00.537) 0:00:20.122 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003863", "end": "2022-06-01 13:31:36.638734", "rc": 0, "start": "2022-06-01 13:31:36.634871" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=372ae1bd-0496-4063-aec0-a4f928956b39 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.484) 0:00:20.607 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003665", "end": "2022-06-01 13:31:37.036242", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:31:37.032577" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.400) 0:00:21.008 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.064) 0:00:21.073 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.031) 0:00:21.105 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.067) 0:00:21.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.036) 0:00:21.208 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.037) 0:00:21.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:31:37 +0000 (0:00:00.031) 0:00:21.277 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.030) 0:00:21.308 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.029) 0:00:21.338 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.033) 0:00:21.372 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.032) 0:00:21.404 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.033) 0:00:21.438 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.066) 0:00:21.504 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.036) 0:00:21.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.031) 0:00:21.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.033) 0:00:21.606 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.032) 0:00:21.639 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.032) 0:00:21.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.031) 0:00:21.703 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.034) 0:00:21.737 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.031) 0:00:21.769 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.057) 0:00:21.827 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.047) 0:00:21.875 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.066) 0:00:21.941 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.037) 0:00:21.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.035) 0:00:22.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.033) 0:00:22.047 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.035) 0:00:22.083 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.067) 0:00:22.150 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.043) 0:00:22.193 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:31:38 +0000 (0:00:00.032) 0:00:22.225 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.061) 0:00:22.286 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.035) 0:00:22.322 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.162) 0:00:22.485 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.034) 0:00:22.520 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "372ae1bd-0496-4063-aec0-a4f928956b39" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11,
"mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "372ae1bd-0496-4063-aec0-a4f928956b39" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.043) 0:00:22.563 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.039) 0:00:22.603 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.036) 0:00:22.640 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.047) 0:00:22.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.030) 0:00:22.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.029) 0:00:22.747 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.030) 0:00:22.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.035) 0:00:22.813 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=372ae1bd-0496-4063-aec0-a4f928956b39 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.051) 0:00:22.865 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.039) 0:00:22.904 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.044) 0:00:22.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.038) 0:00:22.988 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.039) 0:00:23.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.040) 0:00:23.067 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:31:39 +0000 (0:00:00.040) 0:00:23.108 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104691.5561216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104691.5561216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 20924, "isblk": true, "ischr": false, "isdir": false, 
"isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104691.5561216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.414) 0:00:23.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.038) 0:00:23.561 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.036) 0:00:23.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.036) 0:00:23.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:23.666 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS 
device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.036) 0:00:23.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.031) 0:00:23.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.031) 0:00:23.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:23.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.038) 0:00:23.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.034) 0:00:23.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:23.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.033) 0:00:23.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:23.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:24.002 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.039) 0:00:24.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.040) 0:00:24.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.031) 0:00:24.114 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.031) 0:00:24.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.029) 0:00:24.174 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.029) 0:00:24.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.029) 0:00:24.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:31:40 +0000 (0:00:00.032) 0:00:24.266 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:24.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.033) 0:00:24.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:24.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:24.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:24.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.035) 0:00:24.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.033) 0:00:24.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:24.531 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.033) 0:00:24.564 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:24.596 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:24.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.034) 0:00:24.663 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:24.695 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.033) 0:00:24.729 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.035) 0:00:24.765 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.034) 0:00:24.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.037) 0:00:24.838 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.036) 0:00:24.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.076) 0:00:24.951 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.033) 0:00:24.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.030) 0:00:25.014 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.029) 0:00:25.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:25.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:25.107 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:25.139 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.029) 0:00:25.169 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.031) 0:00:25.201 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the partition created above] ************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:43 Wednesday 01 June 2022 17:31:41 +0000 (0:00:00.032) 0:00:25.234 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.065) 0:00:25.299 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.045) 0:00:25.345 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.545) 0:00:25.890 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.076) 0:00:25.966 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.033) 0:00:26.000 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.032) 0:00:26.032 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:31:42 
+0000 (0:00:00.061) 0:00:26.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.029) 0:00:26.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.034) 0:00:26.157 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.043) 0:00:26.201 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.035) 0:00:26.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:31:42 +0000 (0:00:00.033) 0:00:26.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:31:43 +0000 (0:00:00.032) 0:00:26.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:31:43 +0000 (0:00:00.036) 0:00:26.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:31:43 +0000 (0:00:00.033) 0:00:26.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:31:43 +0000 (0:00:00.047) 0:00:26.420 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:31:43 +0000 (0:00:00.028) 0:00:26.448 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", 
"/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:31:44 +0000 (0:00:01.658) 0:00:28.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:31:44 +0000 (0:00:00.034) 0:00:28.141 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:31:44 +0000 (0:00:00.029) 0:00:28.171 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:31:44 +0000 (0:00:00.041) 0:00:28.213 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": 
false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:31:44 +0000 (0:00:00.050) 0:00:28.263 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:31:45 +0000 (0:00:00.039) 0:00:28.302 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:31:45 +0000 (0:00:00.420) 0:00:28.723 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:31:46 +0000 (0:00:00.696) 0:00:29.419 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:31:46 +0000 (0:00:00.075) 0:00:29.494 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:31:47 +0000 (0:00:00.853) 0:00:30.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:31:47 +0000 (0:00:00.390) 0:00:30.738 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:31:47 +0000 (0:00:00.030) 0:00:30.769 ******** ok: [/cache/rhel-x.qcow2] TASK [Ensure the inherited type is reflected in blivet module output] ********** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:57 Wednesday 01 June 2022 17:31:48 +0000 (0:00:00.882) 0:00:31.651 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:62 Wednesday 01 June 2022 17:31:48 +0000 (0:00:00.036) 0:00:31.687 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:31:48 +0000 (0:00:00.054) 0:00:31.742 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 
01 June 2022 17:31:48 +0000 (0:00:00.039) 0:00:31.781 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:31:48 +0000 (0:00:00.031) 0:00:31.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:31:48 +0000 
(0:00:00.395) 0:00:32.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002914", "end": "2022-06-01 13:31:48.634036", "rc": 0, "start": "2022-06-01 13:31:48.631122" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.393) 0:00:32.601 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003285", "end": "2022-06-01 13:31:49.016543", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:31:49.013258" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.386) 0:00:32.988 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.065) 0:00:33.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.029) 0:00:33.083 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.060) 0:00:33.143 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.034) 0:00:33.178 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.030) 0:00:33.208 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.029) 0:00:33.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:31:49 +0000 (0:00:00.030) 0:00:33.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.030) 0:00:33.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.033) 0:00:33.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.035) 0:00:33.367 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.029) 0:00:33.396 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.054) 0:00:33.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.033) 0:00:33.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.036) 0:00:33.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.075) 0:00:33.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.033) 0:00:33.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.032) 0:00:33.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.031) 0:00:33.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.031) 0:00:33.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.032) 0:00:33.757 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.061) 0:00:33.819 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.043) 0:00:33.862 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.063) 0:00:33.925 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.037) 0:00:33.963 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.029) 0:00:33.993 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.030) 0:00:34.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.032) 0:00:34.056 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.065) 0:00:34.122 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=372ae1bd-0496-4063-aec0-a4f928956b39', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=372ae1bd-0496-4063-aec0-a4f928956b39", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.041) 0:00:34.164 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.032) 0:00:34.196 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:31:50 +0000 (0:00:00.060) 0:00:34.257 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.038) 0:00:34.295 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.118) 0:00:34.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.036) 0:00:34.450 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.041) 0:00:34.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.031) 0:00:34.523 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.037) 0:00:34.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.031) 0:00:34.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.032) 0:00:34.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.031) 
0:00:34.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.032) 0:00:34.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.033) 0:00:34.720 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.047) 0:00:34.768 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.026) 0:00:34.795 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.039) 0:00:34.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.032) 0:00:34.867 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.032) 0:00:34.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.033) 0:00:34.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:31:51 +0000 (0:00:00.028) 0:00:34.962 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.394) 0:00:35.356 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.038) 0:00:35.395 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.027) 0:00:35.423 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.035) 0:00:35.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.032) 0:00:35.490 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.026) 0:00:35.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.032) 0:00:35.549 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.033) 0:00:35.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.034) 0:00:35.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.027) 0:00:35.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.029) 0:00:35.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.031) 0:00:35.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.030) 0:00:35.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.030) 0:00:35.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.033) 0:00:35.801 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.094) 0:00:35.895 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.037) 0:00:35.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.032) 0:00:35.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.033) 0:00:35.999 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.034) 0:00:36.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.031) 0:00:36.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.035) 0:00:36.101 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.032) 0:00:36.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.035) 0:00:36.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.036) 0:00:36.204 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.032) 0:00:36.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:31:52 +0000 (0:00:00.031) 0:00:36.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.046) 0:00:36.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.045) 0:00:36.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.044) 0:00:36.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.038) 0:00:36.443 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT 
DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.036) 0:00:36.479 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 0:00:36.512 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.033) 0:00:36.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 0:00:36.578 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.033) 0:00:36.612 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.031) 0:00:36.643 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 
2022 17:31:53 +0000 (0:00:00.035) 0:00:36.679 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.034) 0:00:36.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.033) 0:00:36.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 0:00:36.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.031) 0:00:36.811 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 0:00:36.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 
0:00:36.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.032) 0:00:36.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.036) 0:00:36.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.031) 0:00:36.976 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.033) 0:00:37.010 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.031) 0:00:37.041 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.027) 0:00:37.068 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:64 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.039) 0:00:37.108 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.068) 0:00:37.176 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:31:53 +0000 (0:00:00.044) 0:00:37.221 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.561) 0:00:37.782 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": 
false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.070) 0:00:37.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.031) 0:00:37.884 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.031) 0:00:37.916 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.065) 0:00:37.981 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.027) 0:00:38.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.031) 0:00:38.041 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.046) 0:00:38.087 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.043) 0:00:38.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.032) 0:00:38.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.029) 0:00:38.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.030) 
0:00:38.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:31:54 +0000 (0:00:00.033) 0:00:38.256 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:31:55 +0000 (0:00:00.048) 0:00:38.304 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:31:55 +0000 (0:00:00.032) 0:00:38.337 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:31:56 +0000 (0:00:01.316) 0:00:39.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.032) 0:00:39.686 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.029) 0:00:39.716 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.039) 0:00:39.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.040) 0:00:39.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.037) 0:00:39.834 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.031) 0:00:39.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.035) 0:00:39.901 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.037) 0:00:39.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:31:56 +0000 (0:00:00.031) 0:00:39.971 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:31:57 +0000 (0:00:00.403) 0:00:40.374 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:31:57 +0000 (0:00:00.030) 0:00:40.404 ******** ok: [/cache/rhel-x.qcow2] TASK [Ensure the inherited type is reflected in blivet module output] ********** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:78 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.895) 0:00:41.300 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:83 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.038) 0:00:41.338 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.060) 0:00:41.398 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.041) 0:00:41.440 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.031) 0:00:41.471 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.404) 0:00:41.876 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003143", "end": "2022-06-01 13:31:58.299827", "rc": 0, "start": "2022-06-01 13:31:58.296684" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:31:58 +0000 (0:00:00.394) 0:00:42.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003043", "end": "2022-06-01 13:31:58.695302", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:31:58.692259" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.396) 0:00:42.666 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
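The warning above is Ansible's standard advice when a task file included in a loop reuses a loop variable name (here `storage_test_pool`) that is still in scope from an outer loop. A minimal sketch of the `loop_control` fix it suggests — the task name, file name, and renamed variable below are illustrative, not taken from this test suite:

```yaml
# Hypothetical outer task: rename the loop variable via loop_control so it
# cannot collide with a 'storage_test_pool' fact/variable already in scope.
- name: Verify each pool (illustrative only)
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_item   # avoids clobbering storage_test_pool
```

Inside the included file, references would then use `storage_test_pool_item` instead of the default loop variable, which is what makes the collision warning go away.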
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.066) 0:00:42.733 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.031) 0:00:42.765 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.074) 0:00:42.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.038) 0:00:42.878 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.081) 0:00:42.959 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.029) 0:00:42.989 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.044) 0:00:43.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.032) 0:00:43.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.032) 0:00:43.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.032) 0:00:43.130 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.032) 0:00:43.163 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.059) 0:00:43.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:31:59 +0000 (0:00:00.032) 0:00:43.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.032) 0:00:43.288 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.036) 0:00:43.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.033) 0:00:43.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.030) 0:00:43.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.032) 0:00:43.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.030) 0:00:43.451 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.031) 0:00:43.482 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.061) 0:00:43.543 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.042) 0:00:43.586 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.062) 0:00:43.649 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.039) 0:00:43.688 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.029) 0:00:43.718 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.027) 0:00:43.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.029) 0:00:43.775 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.063) 0:00:43.838 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.050) 0:00:43.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.032) 0:00:43.921 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.061) 0:00:43.982 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.037) 0:00:44.020 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.117) 0:00:44.137 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.040) 0:00:44.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.043) 0:00:44.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:32:00 +0000 (0:00:00.032) 0:00:44.254 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.035) 0:00:44.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.032) 0:00:44.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.037) 0:00:44.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.036) 0:00:44.397 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.036) 0:00:44.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.038) 0:00:44.471 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.048) 0:00:44.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.026) 0:00:44.546 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 17:32:01 +0000 (0:00:00.035) 0:00:44.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.029) 0:00:44.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.033) 0:00:44.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.031) 0:00:44.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.028) 0:00:44.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.411) 0:00:45.116 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.039) 0:00:45.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.029) 0:00:45.185 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:32:01 +0000 (0:00:00.036) 0:00:45.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.092) 0:00:45.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.027) 0:00:45.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.374 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device 
node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.038) 0:00:45.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.031) 0:00:45.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.029) 0:00:45.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.029) 0:00:45.600 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.040) 0:00:45.674 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.040) 0:00:45.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.031) 0:00:45.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:45.779 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.033) 0:00:45.813 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.036) 0:00:45.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.039) 0:00:45.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.036) 0:00:45.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.038) 0:00:45.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.035) 0:00:46.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.034) 0:00:46.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.033) 0:00:46.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.031) 0:00:46.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.034) 0:00:46.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.032) 0:00:46.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.034) 0:00:46.200 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.035) 0:00:46.236 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:32:02 +0000 (0:00:00.031) 0:00:46.267 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.032) 0:00:46.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.037) 0:00:46.336 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.034) 0:00:46.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.035) 0:00:46.407 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.036) 0:00:46.444 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.035) 0:00:46.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.032) 0:00:46.512 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.035) 0:00:46.548 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.032) 0:00:46.580 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.034) 0:00:46.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.032) 0:00:46.648 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.038) 0:00:46.686 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.034) 0:00:46.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.036) 0:00:46.758 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.035) 0:00:46.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.032) 0:00:46.825 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.033) 0:00:46.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran 
handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=227 changed=5 unreachable=0 failed=0 skipped=245 rescued=0 ignored=0 Wednesday 01 June 2022 17:32:03 +0000 (0:00:00.017) 0:00:46.876 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 2.92s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.79s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.66s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.35s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.32s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Gathering Facts --------------------------------------------------------- 1.12s /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:2 ---------------- linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.92s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.91s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : get required packages ---------------------- 0.75s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : set up new/current mounts ------------------ 0.57s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.56s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Find unused disks in the system ----------------------------------------- 0.55s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = 
/usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:32:04 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:32:05 +0000 (0:00:01.316) 0:00:01.338 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_missing_volume_type_in_pool_nvme_generated.yml ***************** 2 plays in /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: 
ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:32:05 +0000 (0:00:00.017) 0:00:01.356 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:32:06 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:32:07 +0000 (0:00:01.347) 0:00:01.370 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.35s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_missing_volume_type_in_pool_scsi_generated.yml ***************** 2 plays in /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_scsi_generated.yml:3 Wednesday 01 June 2022 17:32:07 +0000 (0:00:00.016) 0:00:01.387 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_scsi_generated.yml:7 Wednesday 01 June 2022 17:32:08 +0000 (0:00:01.124) 0:00:02.511 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:2 Wednesday 01 June 2022 17:32:08 +0000 (0:00:00.026) 0:00:02.538 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:9 Wednesday 01 June 2022 17:32:09 +0000 (0:00:00.838) 0:00:03.376 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:32:09 +0000 (0:00:00.040) 0:00:03.417 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.166) 0:00:03.583 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:32:10 +0000 
(0:00:00.561) 0:00:04.145 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.077) 0:00:04.222 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.023) 0:00:04.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.024) 0:00:04.271 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.198) 0:00:04.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:32:10 +0000 (0:00:00.019) 0:00:04.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:32:12 +0000 (0:00:01.090) 0:00:05.579 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.046) 0:00:05.626 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.048) 0:00:05.674 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.724) 0:00:06.398 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.090) 0:00:06.488 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.021) 0:00:06.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.021) 0:00:06.531 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:32:12 +0000 (0:00:00.018) 0:00:06.550 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:32:13 +0000 (0:00:00.829) 0:00:07.379 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:32:15 +0000 (0:00:01.903) 0:00:09.283 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:32:15 +0000 (0:00:00.043) 0:00:09.326 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:32:15 +0000 (0:00:00.026) 0:00:09.353 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.558) 0:00:09.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.030) 0:00:09.942 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.028) 0:00:09.970 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.034) 0:00:10.004 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.033) 0:00:10.037 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.040) 0:00:10.078 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.036) 0:00:10.114 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.032) 0:00:10.146 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.036) 0:00:10.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:32:16 +0000 (0:00:00.032) 0:00:10.215 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:32:17 +0000 (0:00:00.508) 0:00:10.724 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:32:17 +0000 (0:00:00.029) 0:00:10.753 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:12
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.868) 0:00:11.622 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:19
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.029) 0:00:11.651 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.042) 0:00:11.694 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.555) 0:00:12.249 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.036) 0:00:12.286 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.030) 0:00:12.317 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda" ] }

TASK [Create a partition device mounted on "/opt/test1"] ***********************
task path:
/tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:23
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.035) 0:00:12.352 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.058) 0:00:12.410 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:32:18 +0000 (0:00:00.044) 0:00:12.455 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.528) 0:00:12.984 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.073) 0:00:13.057 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.070) 0:00:13.127 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.032) 0:00:13.160 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.062) 0:00:13.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.025) 0:00:13.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.033) 0:00:13.282 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "type": "partition", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.033) 0:00:13.320 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.033) 0:00:13.353 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.031) 0:00:13.384 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.032) 0:00:13.416 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.031) 0:00:13.447 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.031) 0:00:13.479 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.043) 0:00:13.523 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:32:19 +0000 (0:00:00.027) 0:00:13.550 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device":
"/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:32:21 +0000 (0:00:01.784) 0:00:15.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.031) 0:00:15.366 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.026) 0:00:15.393 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": 
false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/sda1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.044) 0:00:15.437 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.037) 0:00:15.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.034) 
0:00:15.509 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:32:21 +0000 (0:00:00.029) 0:00:15.538 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:32:22 +0000 (0:00:01.001) 0:00:16.539 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:32:23 +0000 (0:00:00.585) 0:00:17.125 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:32:24 +0000 (0:00:00.690) 0:00:17.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": 
"da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:32:24 +0000 (0:00:00.405) 0:00:18.221 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:32:24 +0000 (0:00:00.030) 0:00:18.251 ******** ok: [/cache/rhel-x.qcow2] TASK [Ensure the inherited type is reflected in blivet module output] ********** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:36 Wednesday 01 June 2022 17:32:25 +0000 (0:00:00.892) 0:00:19.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:41 Wednesday 01 June 2022 17:32:25 +0000 (0:00:00.037) 0:00:19.180 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 
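The "show storage_pools" task earlier in this run echoes the input the role was invoked with. Reconstructed as play variables, that input corresponds roughly to the following sketch (inferred from the log output, not copied from the literal contents of tests_missing_volume_type_in_pool.yml):

```yaml
# Sketch of the storage_pools input, reconstructed from the
# "show storage_pools" output in this log (not the literal test file).
storage_pools:
  - name: sda
    disks:
      - sda
    type: partition
    volumes:
      - name: test1
        mount_point: /opt/test1
        fs_type: ext4
        # No "type" is given for the volume; the test asserts that it
        # inherits type "partition" from the enclosing pool.
```

The deliberately omitted volume type is what gives the test file (tests_missing_volume_type_in_pool.yml) its name, and the "Ensure the inherited type is reflected in blivet module output" assertion above confirms the inheritance.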
17:32:25 +0000 (0:00:00.050) 0:00:19.231 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:32:25 +0000 (0:00:00.039) 0:00:19.270 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022  17:32:25 +0000 (0:00:00.064)       0:00:19.335 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sda1": { "fstype": "ext4", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "3b87cec2-c2a3-4310-86a3-483aec835b37" },
        "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
        "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
        "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
        "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
        "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
        "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022
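The task that follows reads /etc/fstab, and the verification playbook then asserts that the newly created mount is present. That style of check can be sketched as the following pair of tasks (the task names and the registered variable `storage_test_fstab` are illustrative assumptions, not taken from verify-role-results.yml):

```yaml
# Illustrative sketch of an fstab verification step; the registered
# variable name storage_test_fstab is assumed, not from the test files.
- name: Read /etc/fstab
  command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false

- name: Assert the new ext4 mount for /opt/test1 is present
  assert:
    that:
      - storage_test_fstab.stdout is search('/opt/test1')
      - storage_test_fstab.stdout is search('ext4')
```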
17:32:26 +0000 (0:00:00.488)       0:00:19.824 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [ "cat", "/etc/fstab" ],
    "delta": "0:00:00.003142",
    "end": "2022-06-01 13:32:26.074085",
    "rc": 0,
    "start": "2022-06-01 13:32:26.070943"
}

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=3b87cec2-c2a3-4310-86a3-483aec835b37 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  17:32:26 +0000 (0:00:00.511)       0:00:20.336 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [ "cat", "/etc/crypttab" ],
    "delta": "0:00:00.003013",
    "end": "2022-06-01 13:32:26.475438",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-06-01 13:32:26.472425"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  17:32:27 +0000 (0:00:00.389)       0:00:20.725 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022  17:32:27 +0000 (0:00:00.062)       0:00:20.788 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pool_tests": [ "members", "volumes" ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022  17:32:27 +0000 (0:00:00.030)       0:00:20.818 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact]
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.063) 0:00:20.881 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.031) 0:00:20.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.031) 0:00:20.944 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.033) 0:00:20.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.039) 0:00:21.017 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.034) 0:00:21.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.032) 0:00:21.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.031) 0:00:21.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.033) 0:00:21.150 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.061) 0:00:21.211 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.032) 0:00:21.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.030) 0:00:21.274 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.031) 0:00:21.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.032) 0:00:21.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.029) 0:00:21.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.031) 0:00:21.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.034) 0:00:21.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.040) 0:00:21.474 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:32:27 +0000 (0:00:00.061) 0:00:21.536 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.043) 0:00:21.579 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.063) 0:00:21.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.038) 0:00:21.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.035) 0:00:21.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.032) 0:00:21.749 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } 
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.032) 0:00:21.781 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.064) 0:00:21.846 ********
skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/sda1', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'raid_spare_count': None, u'name': u'test1', u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "present", "type": "partition", "vdo_pool_size": null } }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.042) 0:00:21.888 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.031) 0:00:21.920 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.058) 0:00:21.978 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.079) 0:00:22.057 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.124) 0:00:22.182 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.036) 0:00:22.218 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "3b87cec2-c2a3-4310-86a3-483aec835b37" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419276, "block_size": 4096, "block_total": 2554437, "block_used": 135161, "device": "/dev/sda1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9909354496, "size_total": 10462973952, "uuid": "3b87cec2-c2a3-4310-86a3-483aec835b37" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.042) 0:00:22.260 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.036) 0:00:22.297 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.036) 0:00:22.333 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.038) 0:00:22.372 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.030) 0:00:22.402 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.032) 0:00:22.434 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.030) 0:00:22.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.034) 0:00:22.499 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:32:28 +0000 (0:00:00.048) 0:00:22.547 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.034) 0:00:22.581 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.036) 0:00:22.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.031) 0:00:22.650 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.032) 0:00:22.682 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.036) 0:00:22.719 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.037) 0:00:22.756 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104741.0161216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104741.0161216, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 21122, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654104741.0161216, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.391) 0:00:23.148 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.037) 0:00:23.186 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.043) 0:00:23.229 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.042) 0:00:23.271 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.032) 0:00:23.304 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.039) 0:00:23.343 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.035) 0:00:23.379 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.032) 0:00:23.412 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.032) 0:00:23.444 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.040) 0:00:23.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.032) 0:00:23.518 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:32:29 +0000 (0:00:00.033) 0:00:23.551 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.035) 0:00:23.587 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:23.619 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:23.653 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.039) 0:00:23.693 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.037) 0:00:23.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:23.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:23.797 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.031) 0:00:23.829 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:23.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:23.894 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:23.927 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.031) 0:00:23.959 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.034) 0:00:23.993 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:24.027 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:24.059 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:24.091 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.035) 0:00:24.126 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:24.160 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.065) 0:00:24.225 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.036) 0:00:24.262 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:24.295 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.033) 0:00:24.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.034) 0:00:24.362 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.032) 0:00:24.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.034) 0:00:24.429 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.081) 0:00:24.511 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:32:30 +0000 (0:00:00.035) 0:00:24.546 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.031) 0:00:24.577 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.030) 0:00:24.608 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.029) 0:00:24.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.029) 0:00:24.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.034) 0:00:24.702 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.030) 0:00:24.733 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.030) 0:00:24.764 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.029) 0:00:24.793 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.030) 0:00:24.824 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.029) 0:00:24.853 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.030) 0:00:24.884 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Remove the partition created above] **************************************
task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:43
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.034) 0:00:24.918 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.065) 0:00:24.983 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:32:31 +0000 (0:00:00.047) 0:00:25.031 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.575) 0:00:25.606 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.073) 0:00:25.679 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.033) 0:00:25.713 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.032) 0:00:25.745 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.061) 0:00:25.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.025) 0:00:25.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.033) 0:00:25.866 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.034) 0:00:25.905 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.034) 0:00:25.939 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.031) 0:00:25.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.031) 0:00:26.002 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.032) 0:00:26.035 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.034) 0:00:26.069 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.033) 0:00:26.133 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:32:32 +0000 (0:00:00.033) 0:00:26.167 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:32:34 +0000 (0:00:01.731) 0:00:27.898 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.030) 0:00:27.929 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.028) 0:00:27.958 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.046) 0:00:28.005 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed":
false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.093) 0:00:28.098 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.036) 0:00:28.135 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:32:34 +0000 (0:00:00.404) 0:00:28.539 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:32:35 +0000 (0:00:00.701) 0:00:29.240 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:32:35 +0000 (0:00:00.031) 0:00:29.272 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:32:36 +0000 (0:00:00.670) 0:00:29.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:32:36 +0000 (0:00:00.425) 0:00:30.368 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:32:36 +0000 (0:00:00.029) 0:00:30.398 ******** ok: [/cache/rhel-x.qcow2] TASK [Ensure the inherited type is reflected in blivet module output] ********** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:57 Wednesday 01 June 2022 17:32:37 +0000 (0:00:00.865) 0:00:31.264 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:62 Wednesday 01 June 2022 17:32:37 +0000 (0:00:00.037) 0:00:31.301 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:32:37 +0000 (0:00:00.055) 0:00:31.356 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 
01 June 2022 17:32:37 +0000 (0:00:00.037) 0:00:31.394 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:32:37 +0000 (0:00:00.031) 0:00:31.425 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:32:38 +0000 
(0:00:00.417) 0:00:31.843 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003379", "end": "2022-06-01 13:32:38.024209", "rc": 0, "start": "2022-06-01 13:32:38.020830" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:32:38 +0000 (0:00:00.434) 0:00:32.277 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003664", "end": "2022-06-01 13:32:38.427601", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:32:38.423937" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.401) 0:00:32.679 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.060) 0:00:32.739 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.032) 0:00:32.772 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.062) 0:00:32.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.030) 0:00:32.866 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.029) 0:00:32.895 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.029) 0:00:32.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.030) 0:00:32.955 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.032) 0:00:32.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.065) 0:00:33.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.029) 0:00:33.083 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.027) 0:00:33.111 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.061) 0:00:33.172 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.036) 0:00:33.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.032) 0:00:33.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.030) 0:00:33.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.031) 0:00:33.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.033) 0:00:33.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.031) 0:00:33.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.034) 0:00:33.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.032) 0:00:33.436 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.059) 0:00:33.496 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:32:39 +0000 (0:00:00.043) 0:00:33.539 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.067) 0:00:33.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.038) 0:00:33.645 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.029) 0:00:33.674 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.030) 0:00:33.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.033) 0:00:33.737 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.066) 0:00:33.804 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'ext4', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': 10736369664, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'UUID=3b87cec2-c2a3-4310-86a3-483aec835b37', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_mount_id": "UUID=3b87cec2-c2a3-4310-86a3-483aec835b37", "_raw_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10736369664, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.042) 0:00:33.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.032) 0:00:33.879 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.062) 0:00:33.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.035) 0:00:33.977 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.119) 0:00:34.097 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.038) 0:00:34.136 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.046) 0:00:34.182 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.030) 0:00:34.213 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.037) 0:00:34.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.035) 0:00:34.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.032) 0:00:34.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.033) 
0:00:34.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.032) 0:00:34.383 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.030) 0:00:34.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.047) 0:00:34.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.029) 0:00:34.491 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.037) 0:00:34.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:32:40 +0000 (0:00:00.032) 0:00:34.561 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.032) 0:00:34.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.032) 0:00:34.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.027) 0:00:34.653 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.401) 0:00:35.054 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.038) 0:00:35.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.027) 0:00:35.120 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.034) 0:00:35.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.030) 0:00:35.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.025) 0:00:35.210 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.031) 0:00:35.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.078) 0:00:35.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.035) 0:00:35.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.030) 0:00:35.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.033) 0:00:35.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.032) 0:00:35.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.032) 0:00:35.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.035) 0:00:35.520 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:32:41 +0000 (0:00:00.033) 0:00:35.553 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.041) 0:00:35.595 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.038) 0:00:35.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:35.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:35.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.034) 0:00:35.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 0:00:35.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 0:00:35.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.035) 0:00:35.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 0:00:35.869 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:35.901 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.035) 0:00:35.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.031) 0:00:35.967 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.031) 0:00:35.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.031) 0:00:36.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.031) 0:00:36.062 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:36.094 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT 
DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.037) 0:00:36.131 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:36.164 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.034) 0:00:36.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.034) 0:00:36.234 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.030) 0:00:36.264 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.031) 0:00:36.296 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 
2022 17:32:42 +0000 (0:00:00.039) 0:00:36.335 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 0:00:36.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.029) 0:00:36.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.032) 0:00:36.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 0:00:36.464 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.034) 0:00:36.499 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:32:42 +0000 (0:00:00.033) 
0:00:36.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.034) 0:00:36.567 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.031) 0:00:36.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.030) 0:00:36.629 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.034) 0:00:36.663 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.030) 0:00:36.694 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.030) 0:00:36.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": 
null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:64 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.031) 0:00:36.756 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.067) 0:00:36.823 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.046) 0:00:36.870 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.542) 0:00:37.413 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": 
false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.072) 0:00:37.485 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.032) 0:00:37.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:32:43 +0000 (0:00:00.033) 0:00:37.551 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.063) 0:00:37.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.027) 0:00:37.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.033) 0:00:37.676 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "sda", "state": "absent", "type": "partition", "volumes": [ { "mount_point": "/opt/test1", "name": "sda1", "state": "absent" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.083) 0:00:37.760 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.034) 0:00:37.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.031) 0:00:37.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.033) 0:00:37.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.033) 
0:00:37.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.031) 0:00:37.923 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.047) 0:00:37.971 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:32:44 +0000 (0:00:00.029) 0:00:38.001 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:32:45 +0000 (0:00:01.270) 0:00:39.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.047) 0:00:39.319 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.030) 0:00:39.349 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.040) 0:00:39.390 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.037) 0:00:39.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.034) 0:00:39.462 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.031) 0:00:39.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.032) 0:00:39.526 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:32:45 +0000 (0:00:00.029) 0:00:39.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:32:46 +0000 (0:00:00.031) 0:00:39.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:32:46 +0000 (0:00:00.398) 0:00:39.986 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:32:46 +0000 (0:00:00.030) 0:00:40.016 ******** ok: [/cache/rhel-x.qcow2] TASK [Ensure the inherited type is reflected in blivet module output] ********** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:78 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.889) 0:00:40.906 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:83 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.040) 0:00:40.946 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.070) 0:00:41.017 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "sda", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "partition", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.044) 0:00:41.062 ******** skipping: 
[/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.033) 0:00:41.095 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:32:47 +0000 (0:00:00.412) 0:00:41.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003582", "end": "2022-06-01 13:32:47.649986", "rc": 0, "start": "2022-06-01 13:32:47.646404" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.395) 0:00:41.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003515", "end": "2022-06-01 13:32:48.050194", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:32:48.046679" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.399) 0:00:42.302 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
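The loop-variable warning above points at a standard fix: give the inner loop its own variable name via `loop_control` so it no longer shadows the outer loop's `storage_test_pool`. A minimal sketch of that pattern, assuming illustrative task and variable names (they are not taken from this test suite):

```yaml
# Hypothetical illustration of the fix the warning suggests:
# rename the inner loop's variable so it does not collide with
# the outer loop's 'storage_test_pool'.
- name: Verify each pool (illustrative)
  include_tasks: test-verify-pool.yml
  loop: "{{ _storage_pools_list }}"
  loop_control:
    loop_var: storage_test_pool_inner  # any name not already in use
```

Inside `test-verify-pool.yml` the item would then be referenced as `storage_test_pool_inner` instead of the default `item` (or the colliding name), which is what silences the "already in use" warning.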
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.064) 0:00:42.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.029) 0:00:42.396 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.064) 0:00:42.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.030) 0:00:42.492 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.031) 0:00:42.523 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:32:48 +0000 (0:00:00.035) 0:00:42.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.033) 0:00:42.592 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.030) 0:00:42.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.074) 0:00:42.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.031) 0:00:42.729 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.026) 0:00:42.756 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.054) 0:00:42.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.031) 0:00:42.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.034) 0:00:42.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.032) 0:00:42.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.032) 0:00:42.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.035) 0:00:42.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.035) 0:00:43.012 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.032) 0:00:43.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } 
TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.035) 0:00:43.080 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.060) 0:00:43.141 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.039) 0:00:43.181 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.063) 0:00:43.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.037) 0:00:43.282 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.030) 0:00:43.313 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.029) 0:00:43.342 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.032) 0:00:43.375 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.064) 0:00:43.440 ******** skipping: [/cache/rhel-x.qcow2] => (item={u'_raw_device': u'', u'raid_metadata_version': None, u'raid_disks': [], u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'', u'size': 0, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'absent', u'vdo_pool_size': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'cache_size': 0, u'_mount_id': u'', u'raid_spare_count': None, u'name': u'sda1', u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "sda1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 0, "state": "absent", "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.042) 0:00:43.482 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:32:49 +0000 (0:00:00.031) 0:00:43.514 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.062) 0:00:43.577 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.040) 0:00:43.618 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.122) 0:00:43.740 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.037) 0:00:43.778 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.041) 0:00:43.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.029) 0:00:43.849 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.035) 0:00:43.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.030) 0:00:43.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.029) 0:00:43.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.029) 0:00:43.974 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.029) 0:00:44.003 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.031) 0:00:44.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.048) 0:00:44.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.028) 0:00:44.112 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 
01 June 2022 17:32:50 +0000 (0:00:00.039) 0:00:44.152 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.030) 0:00:44.183 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.031) 0:00:44.214 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.030) 0:00:44.244 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:32:50 +0000 (0:00:00.029) 0:00:44.273 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.407) 0:00:44.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.036) 0:00:44.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.024) 0:00:44.742 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.031) 0:00:44.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.027) 0:00:44.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.024) 0:00:44.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.030) 0:00:44.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device 
node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.032) 0:00:44.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.073) 0:00:44.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.025) 0:00:44.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.030) 0:00:45.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.031) 0:00:45.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.030) 0:00:45.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.032) 0:00:45.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.033) 0:00:45.147 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.040) 0:00:45.187 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.036) 0:00:45.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.033) 0:00:45.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.038) 0:00:45.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.034) 0:00:45.330 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.038) 0:00:45.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.034) 0:00:45.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.034) 0:00:45.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.030) 0:00:45.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.031) 0:00:45.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:32:51 +0000 (0:00:00.030) 0:00:45.531 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:45.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.033) 0:00:45.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.033) 0:00:45.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.034) 0:00:45.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:45.698 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.034) 0:00:45.732 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.035) 0:00:45.767 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:45.800 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.034) 0:00:45.834 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.033) 0:00:45.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.033) 0:00:45.900 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.037) 0:00:45.937 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.038) 0:00:45.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:46.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:46.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:46.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.037) 0:00:46.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.037) 0:00:46.148 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.036) 0:00:46.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:46.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.032) 0:00:46.250 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.031) 0:00:46.281 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.030) 0:00:46.311 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.026) 0:00:46.338 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran 
handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=229 changed=5 unreachable=0 failed=0 skipped=245 rescued=0 ignored=0

Wednesday 01 June 2022 17:32:52 +0000 (0:00:00.016) 0:00:46.355 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.78s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.35s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.27s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_scsi_generated.yml:3 -
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.00s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.84s
/tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml:2 ----------------
linux-system-roles.storage : make sure required packages are installed --- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : set up new/current mounts ------------------ 0.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.58s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:32:53 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:32:54 +0000 (0:00:01.348) 0:00:01.371 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.35s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_null_raid_pool.yml ********************************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:2 Wednesday 01 June 2022 
17:32:54 +0000 (0:00:00.013) 0:00:01.384 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:13 Wednesday 01 June 2022 17:32:56 +0000 (0:00:01.156) 0:00:02.540 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:32:56 +0000 (0:00:00.038) 0:00:02.579 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:32:56 +0000 (0:00:00.165) 0:00:02.744 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:32:56 +0000 (0:00:00.574) 0:00:03.319 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:32:56 +0000 (0:00:00.082) 0:00:03.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:32:56 +0000 (0:00:00.022) 0:00:03.424 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:32:57 +0000 (0:00:00.022) 0:00:03.446 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:32:57 +0000 (0:00:00.195) 0:00:03.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:32:57 +0000 (0:00:00.019) 0:00:03.661 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:32:58 +0000 (0:00:01.098) 0:00:04.759 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:32:58 +0000 (0:00:00.048) 0:00:04.808 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:32:58 +0000 (0:00:00.045) 0:00:04.853 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:32:59 +0000 (0:00:00.732) 0:00:05.585 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:32:59 +0000 (0:00:00.082) 0:00:05.668 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:32:59 +0000 (0:00:00.021) 0:00:05.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:32:59 +0000 (0:00:00.023) 0:00:05.714 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:32:59 +0000 (0:00:00.021) 0:00:05.735 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:33:00 +0000 (0:00:00.840) 0:00:06.575 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:33:02 +0000 (0:00:01.892) 0:00:08.468 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:33:02 +0000 
(0:00:00.044) 0:00:08.513 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.026) 0:00:08.540 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.549) 0:00:09.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.031) 0:00:09.120 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.029) 0:00:09.150 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.033) 0:00:09.184 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.035) 0:00:09.219 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.033) 0:00:09.253 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.027) 0:00:09.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.029) 0:00:09.310 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.034) 0:00:09.344 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:33:02 +0000 (0:00:00.032) 0:00:09.376 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:33:03 +0000 (0:00:00.518) 0:00:09.895 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:33:03 +0000 (0:00:00.027) 0:00:09.922 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:16 Wednesday 01 June 2022 17:33:04 +0000 (0:00:00.860) 0:00:10.782 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:23 Wednesday 01 June 2022 17:33:04 +0000 (0:00:00.029) 0:00:10.812 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:33:04 +0000 (0:00:00.042) 0:00:10.854 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:33:04 +0000 (0:00:00.530) 0:00:11.385 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:33:04 +0000 (0:00:00.036) 0:00:11.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:33:05 +0000 (0:00:00.031) 0:00:11.453 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [get existing raids (before run)] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:27 Wednesday 01 June 2022 17:33:05 +0000 (0:00:00.033) 0:00:11.487 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/proc/mdstat" ], "delta": "0:00:00.003034", "end": "2022-06-01 13:33:04.873765", "rc": 0, "start": "2022-06-01 13:33:04.870731" } STDOUT: Personalities : [raid0] [raid6] [raid5] [raid4] [raid1] unused devices: TASK [check that raid_level null does not create raid] ************************* task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:32 Wednesday 01 June 2022 17:33:05 +0000 (0:00:00.519) 0:00:12.007 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:33:05 +0000 (0:00:00.054) 0:00:12.062 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:33:05 +0000 (0:00:00.073) 0:00:12.135 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.576) 0:00:12.711 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.068) 0:00:12.780 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.029) 0:00:12.810 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.028) 0:00:12.838 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.057) 0:00:12.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.024) 0:00:12.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.031) 0:00:12.952 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "null", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 
June 2022 17:33:06 +0000 (0:00:00.036) 0:00:12.988 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.034) 0:00:13.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.031) 0:00:13.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.030) 0:00:13.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.030) 0:00:13.115 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.034) 0:00:13.150 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] 
******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.045) 0:00:13.196 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:33:06 +0000 (0:00:00.030) 0:00:13.226 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:33:09 +0000 (0:00:02.953) 0:00:16.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.032) 0:00:16.211 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.029) 0:00:16.241 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": 
"/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.044) 0:00:16.285 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** 
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.039) 0:00:16.325 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.034) 0:00:16.359 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:33:09 +0000 (0:00:00.031) 0:00:16.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:33:10 +0000 (0:00:01.021) 0:00:17.412 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:33:11 +0000 (0:00:00.591) 0:00:18.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : 
retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:33:12 +0000 (0:00:00.679) 0:00:18.683 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:33:12 +0000 (0:00:00.405) 0:00:19.089 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:33:12 +0000 (0:00:00.029) 0:00:19.118 ******** ok: [/cache/rhel-x.qcow2] TASK [get existing raids (after run)] ****************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:47 Wednesday 01 June 2022 17:33:13 +0000 (0:00:00.942) 0:00:20.061 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/proc/mdstat" ], "delta": "0:00:00.002970", "end": "2022-06-01 13:33:13.311452", "rc": 0, 
"start": "2022-06-01 13:33:13.308482" } STDOUT: Personalities : [raid0] [raid6] [raid5] [raid4] [raid1] unused devices: TASK [cleanup] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:53 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.386) 0:00:20.447 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.059) 0:00:20.506 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.045) 0:00:20.552 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.530) 0:00:21.082 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.074) 0:00:21.156 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.030) 0:00:21.186 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.029) 0:00:21.216 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.062) 0:00:21.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.025) 0:00:21.303 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.029) 0:00:21.333 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "null", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.035) 0:00:21.369 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.034) 0:00:21.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:33:14 +0000 (0:00:00.032) 0:00:21.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:33:15 +0000 (0:00:00.032) 0:00:21.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:33:15 +0000 (0:00:00.036) 0:00:21.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:33:15 +0000 (0:00:00.038) 0:00:21.544 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:33:15 +0000 (0:00:00.048) 0:00:21.593 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:33:15 +0000 (0:00:00.034) 0:00:21.627 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", 
"/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:33:18 +0000 (0:00:03.034) 0:00:24.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.074) 0:00:24.736 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.030) 0:00:24.767 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": 
"/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.043) 0:00:24.811 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.040) 0:00:24.851 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.037) 0:00:24.889 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:33:18 +0000 (0:00:00.419) 0:00:25.308 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 
June 2022 17:33:19 +0000 (0:00:00.703) 0:00:26.012 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:33:19 +0000 (0:00:00.031) 0:00:26.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:33:20 +0000 (0:00:00.685) 0:00:26.728 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:33:20 +0000 (0:00:00.396) 0:00:27.124 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:33:20 +0000 
(0:00:00.031) 0:00:27.156 ******** ok: [/cache/rhel-x.qcow2] TASK [compare mdstat results] ************************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:68 Wednesday 01 June 2022 17:33:21 +0000 (0:00:00.872) 0:00:28.028 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=66 changed=4 unreachable=0 failed=0 skipped=35 rescued=0 ignored=0 Wednesday 01 June 2022 17:33:21 +0000 (0:00:00.020) 0:00:28.049 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.95s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.35s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.16s /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:2 ----------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.10s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.02s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.94s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.73s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : set up new/current mounts ------------------ 0.59s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.58s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.57s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Find unused disks in the system ----------------------------------------- 0.53s 
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:33:22 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:33:23 +0000 (0:00:01.375) 0:00:01.398 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_null_raid_pool_nvme_generated.yml ****************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_null_raid_pool_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers 
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:33:23 +0000 (0:00:00.015)       0:00:01.413 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.38s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  17:33:24 +0000 (0:00:00.022)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:33:25 +0000 (0:00:01.367)       0:00:01.390 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_null_raid_pool_scsi_generated.yml ******************************
2 plays in /tmp/tmp7247_7fr/tests/tests_null_raid_pool_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool_scsi_generated.yml:3
Wednesday 01 June 2022  17:33:25 +0000 (0:00:00.017)       0:00:01.408 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool_scsi_generated.yml:7
Wednesday 01 June 2022  17:33:27 +0000 (0:00:01.108)       0:00:02.517 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:2
Wednesday 01 June 2022  17:33:27 +0000 (0:00:00.025)       0:00:02.542 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:13
Wednesday 01 June 2022  17:33:27 +0000 (0:00:00.873)       0:00:03.416 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:33:27 +0000 (0:00:00.041)       0:00:03.458 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.159)       0:00:03.617 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.552)       0:00:04.170 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.090)       0:00:04.260 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.023)       0:00:04.284 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.023)       0:00:04.308 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:33:28 +0000 (0:00:00.200)       0:00:04.508 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:33:29 +0000 (0:00:00.019)       0:00:04.528 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:33:30 +0000 (0:00:01.089)       0:00:05.617 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:33:30 +0000 (0:00:00.048)       0:00:05.666 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:33:30 +0000 (0:00:00.047)       0:00:05.713 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:33:30 +0000 (0:00:00.711)       0:00:06.424 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  17:33:30 +0000 (0:00:00.080)       0:00:06.505 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  17:33:31 +0000 (0:00:00.021)       0:00:06.527 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  17:33:31 +0000 (0:00:00.026)       0:00:06.554 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  17:33:31 +0000 (0:00:00.022)       0:00:06.576 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  17:33:31 +0000 (0:00:00.832)       0:00:07.408 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd",
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:33:33 +0000 (0:00:01.935) 0:00:09.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:33:33 +0000 (0:00:00.044) 0:00:09.388 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:33:33 +0000 (0:00:00.028) 0:00:09.417 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.562) 0:00:09.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.031) 0:00:10.011 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.027) 0:00:10.038 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.034) 0:00:10.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.033) 0:00:10.106 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.032) 0:00:10.138 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.028) 0:00:10.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.032) 0:00:10.200 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.029) 0:00:10.230 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:33:34 +0000 (0:00:00.030) 0:00:10.260 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:33:35 +0000 (0:00:00.490) 0:00:10.750 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:33:35 +0000 (0:00:00.034) 0:00:10.785 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:16 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.874) 0:00:11.659 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:23 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.030) 0:00:11.690 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.043) 0:00:11.733 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.562) 0:00:12.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.037) 0:00:12.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.033) 0:00:12.367 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb" ] } TASK [get existing raids (before run)] ***************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:27 Wednesday 01 June 2022 17:33:36 +0000 (0:00:00.035) 0:00:12.402 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/proc/mdstat" ], "delta": "0:00:00.004033", "end": "2022-06-01 
13:33:36.753890", "rc": 0, "start": "2022-06-01 13:33:36.749857" } STDOUT: Personalities : [raid0] [raid6] [raid5] [raid4] [raid1] unused devices: TASK [check that raid_level null does not create raid] ************************* task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:32 Wednesday 01 June 2022 17:33:37 +0000 (0:00:00.551) 0:00:12.954 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:33:37 +0000 (0:00:00.091) 0:00:13.045 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:33:37 +0000 (0:00:00.045) 0:00:13.091 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.533) 0:00:13.624 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.070) 0:00:13.695 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.028) 0:00:13.723 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.029) 0:00:13.753 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.060) 0:00:13.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.028) 0:00:13.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show 
storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.031) 0:00:13.874 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "null", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.036) 0:00:13.911 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.032) 0:00:13.943 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.030) 0:00:13.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.029) 0:00:14.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.032) 0:00:14.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.047) 0:00:14.084 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.045) 0:00:14.129 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:33:38 +0000 (0:00:00.028) 0:00:14.158 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", 
"/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:33:41 +0000 (0:00:03.039) 0:00:17.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd 
cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.034) 0:00:17.231 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.029) 0:00:17.261 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.043) 0:00:17.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.039) 0:00:17.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.034) 0:00:17.378 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:33:41 +0000 (0:00:00.031) 0:00:17.409 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:33:42 +0000 (0:00:01.044) 0:00:18.454 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": 
"mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:33:43 +0000 (0:00:00.586) 0:00:19.040 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:33:44 +0000 (0:00:00.687) 0:00:19.727 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:33:44 
+0000 (0:00:00.395) 0:00:20.122 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:33:44 +0000 (0:00:00.027) 0:00:20.150 ******** ok: [/cache/rhel-x.qcow2] TASK [get existing raids (after run)] ****************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:47 Wednesday 01 June 2022 17:33:45 +0000 (0:00:00.887) 0:00:21.037 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/proc/mdstat" ], "delta": "0:00:00.003852", "end": "2022-06-01 13:33:45.227072", "rc": 0, "start": "2022-06-01 13:33:45.223220" } STDOUT: Personalities : [raid0] [raid6] [raid5] [raid4] [raid1] unused devices: TASK [cleanup] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:53 Wednesday 01 June 2022 17:33:45 +0000 (0:00:00.391) 0:00:21.428 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:33:45 +0000 (0:00:00.058) 0:00:21.487 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.046) 0:00:21.534 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.542) 0:00:22.076 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.071) 0:00:22.148 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.038) 0:00:22.186 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.031) 0:00:22.218 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.067)       0:00:22.285 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.026)       0:00:22.312 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.034)       0:00:22.346 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "vg1", "raid_level": "null", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.041)       0:00:22.388 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.039)       0:00:22.428 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.032)       0:00:22.460 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:33:46 +0000 (0:00:00.031)       0:00:22.492 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:33:47 +0000 (0:00:00.032)       0:00:22.524 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:33:47 +0000 (0:00:00.034)       0:00:22.559 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:33:47 +0000 (0:00:00.050)       0:00:22.609 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:33:47 +0000 (0:00:00.030)       0:00:22.639 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, {
"action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": 
"present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:33:50 +0000 (0:00:03.240)       0:00:25.880 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.029)       0:00:25.909 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.028)       0:00:25.937 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs",
"dosfstools" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.044) 0:00:25.982 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": "null", "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.041)       0:00:26.024 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.037)       0:00:26.061 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:33:50 +0000 (0:00:00.407)       0:00:26.469 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:33:51 +0000 (0:00:00.686)       0:00:27.155 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:33:51 +0000 (0:00:00.030)       0:00:27.186 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:33:52 +0000 (0:00:00.657)       0:00:27.843 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:33:52 +0000 (0:00:00.403)       0:00:28.247 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:33:52 +0000 (0:00:00.030)       0:00:28.277 ********
ok: [/cache/rhel-x.qcow2]

TASK [compare mdstat results] **************************************************
task path: /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:68
Wednesday 01 June 2022 17:33:53 +0000 (0:00:00.898)       0:00:29.176 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=68   changed=4    unreachable=0    failed=0    skipped=35   rescued=0    ignored=0
Wednesday 01 June 2022 17:33:53 +0000 (0:00:00.022)       0:00:29.198 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.24s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/tests_null_raid_pool_scsi_generated.yml:3 --------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml:2 -----------------------------
linux-system-roles.storage : make sure required packages are installed --- 0.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.66s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : set up new/current mounts ------------------ 0.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Find unused disks in the system ----------------------------------------- 0.56s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.56s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:33:54 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
Wednesday 01 June 2022 17:33:55 +0000 (0:00:01.393)       0:00:01.416 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.39s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_raid_pool_options.yml ******************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path:
/tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:2
Wednesday 01 June 2022 17:33:55 +0000 (0:00:00.015)       0:00:01.432 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:17
Wednesday 01 June 2022 17:33:56 +0000 (0:00:01.106)       0:00:02.538 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.038)       0:00:02.577 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.158)       0:00:02.735 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.547)       0:00:03.282 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.078)       0:00:03.361 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.024)       0:00:03.385 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:33:57 +0000 (0:00:00.023)       0:00:03.409 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:33:58 +0000 (0:00:00.215)       0:00:03.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:33:58 +0000 (0:00:00.020)       0:00:03.644 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:33:59 +0000 (0:00:01.094)       0:00:04.739 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:33:59 +0000 (0:00:00.056)       0:00:04.795 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:33:59 +0000 (0:00:00.047)       0:00:04.842 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:33:59 +0000 (0:00:00.720)       0:00:05.563 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:34:00 +0000 (0:00:00.083)       0:00:05.646 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:34:00 +0000 (0:00:00.021)       0:00:05.667 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:34:00 +0000 (0:00:00.022)       0:00:05.690 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:34:00 +0000 (0:00:00.020)       0:00:05.711 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:34:01 +0000 (0:00:00.860)       0:00:06.571 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name":
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { 
"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", 
"status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:34:02 +0000 (0:00:01.938) 0:00:08.509 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 
Wednesday 01 June 2022 17:34:02 +0000 (0:00:00.045) 0:00:08.555 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.027) 0:00:08.583 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.534) 0:00:09.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.031) 0:00:09.149 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.028) 0:00:09.177 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.035) 0:00:09.213 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.033) 0:00:09.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.034) 0:00:09.280 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.028) 0:00:09.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.032) 0:00:09.341 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.029) 0:00:09.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:34:03 +0000 (0:00:00.031) 0:00:09.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:34:04 +0000 (0:00:00.505) 0:00:09.907 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:34:04 +0000 (0:00:00.028) 0:00:09.935 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:20 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.878) 0:00:10.814 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:27 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.031) 0:00:10.845 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.044) 
0:00:10.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.542) 0:00:11.432 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.036) 0:00:11.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.030) 0:00:11.500 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb", "sdc" ] } TASK [Create a RAID1 device] *************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:32 Wednesday 01 June 2022 17:34:05 +0000 (0:00:00.034) 0:00:11.535 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.053) 0:00:11.589 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.044) 0:00:11.633 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.528) 0:00:12.162 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.073) 0:00:12.235 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.031) 0:00:12.266 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:34:06 +0000 (0:00:00.032) 0:00:12.299 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.061) 0:00:12.360 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.028) 0:00:12.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.031) 0:00:12.419 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.041) 0:00:12.461 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] 
********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.034) 0:00:12.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.031) 0:00:12.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:34:06 +0000 (0:00:00.029) 0:00:12.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:34:07 +0000 (0:00:00.028) 0:00:12.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:34:07 +0000 (0:00:00.030) 0:00:12.615 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:34:07 +0000 (0:00:00.045) 0:00:12.661 ******** TASK [linux-system-roles.storage : manage the pools and 
volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:34:07 +0000 (0:00:00.026) 0:00:12.687 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": 
"mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:34:16 +0000 (0:00:09.519) 0:00:22.207 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:34:16 +0000 (0:00:00.032) 0:00:22.239 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:34:16 +0000 (0:00:00.030) 0:00:22.270 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", 
"/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "lvm2", "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
17:34:16 +0000 (0:00:00.046) 0:00:22.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": 
"xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:34:16 +0000 (0:00:00.043) 0:00:22.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:34:16 +0000 (0:00:00.039) 0:00:22.400 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of 
/etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:34:16 +0000 (0:00:00.031) 0:00:22.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:34:17 +0000 (0:00:01.035) 0:00:23.467 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": 
"/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:34:19 +0000 (0:00:01.349) 0:00:24.817 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:34:19 +0000 (0:00:00.728) 0:00:25.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:34:20 +0000 (0:00:00.412) 0:00:25.958 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:34:20 +0000 (0:00:00.030) 0:00:25.989 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:56 Wednesday 01 June 2022 17:34:21 +0000 (0:00:00.977) 0:00:26.967 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:34:21 +0000 (0:00:00.056) 0:00:27.023 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:34:21 +0000 (0:00:00.051) 0:00:27.074 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:34:21 +0000 (0:00:00.036) 0:00:27.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "10G", "type": "raid1", "uuid": "NO99h4-jEV3-4F7A-IEDh-KsNa-Hael-AIf0Dn" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sr0": { 
"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:34:22 +0000 (0:00:00.527) 0:00:27.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003203", "end": "2022-06-01 13:34:21.874945", "rc": 0, "start": "2022-06-01 13:34:21.871742" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0
/dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0
/dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:34:22 +0000 (0:00:00.513) 0:00:28.152 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003102", "end": "2022-06-01 13:34:22.283069", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:34:22.279967" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:34:22 +0000 (0:00:00.397) 0:00:28.550 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.080) 0:00:28.630 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.034) 0:00:28.664 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.068) 0:00:28.732 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.043) 0:00:28.776 ******** ok: 
[/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.568) 0:00:29.345 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.045) 0:00:29.391 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.042) 0:00:29.433 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.040) 0:00:29.474 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:34:23 +0000 (0:00:00.058) 0:00:29.532 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.058) 0:00:29.590 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.088) 0:00:29.679 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.064) 0:00:29.743 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.008438", "end": "2022-06-01 13:34:23.884535", "rc": 0, "start": "2022-06-01 13:34:23.876097" } STDOUT:
/dev/md/vg1-1:
           Version : 1.0
     Creation Time : Wed Jun 1 13:34:09 2022
        Raid Level : raid1
        Array Size : 10484608 (10.00 GiB 10.74 GB)
     Used Dev Size : 10484608 (10.00 GiB 10.74 GB)
      Raid Devices : 2
     Total Devices : 3
       Persistence : Superblock is persistent
     Intent Bitmap : Internal
       Update Time : Wed Jun 1 13:34:22 2022
             State : clean, resyncing
    Active Devices : 2
   Working Devices : 3
    Failed Devices : 0
     Spare Devices : 1
Consistency Policy : bitmap
     Resync Status : 26% complete
              Name : vg1-1
              UUID : 95300f00:6f60c2bd:29651149:2f36ea38
            Events : 11
    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1
       2       8       33        -      spare   /dev/sdc1
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.410) 0:00:30.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, 
"changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.045) 0:00:30.199 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.043) 0:00:30.243 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.042) 0:00:30.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.044) 0:00:30.330 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.046) 0:00:30.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.045) 0:00:30.422 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK 
[Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.032) 0:00:30.454 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:34:24 +0000 (0:00:00.060) 0:00:30.515 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.095) 0:00:30.611 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.033) 0:00:30.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.033) 0:00:30.678 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.034) 0:00:30.713 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.039) 0:00:30.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.037) 0:00:30.790 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.035) 0:00:30.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.034) 0:00:30.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.034) 0:00:30.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.033) 0:00:30.927 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.063) 0:00:30.991 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.039) 0:00:31.030 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.040) 0:00:31.071 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.063) 0:00:31.134 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.038) 0:00:31.173 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.039) 0:00:31.212 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.032) 0:00:31.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.037) 0:00:31.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.037) 0:00:31.320 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.035) 0:00:31.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.036) 0:00:31.391 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.071) 0:00:31.463 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:34:25 +0000 (0:00:00.100) 0:00:31.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.035) 0:00:31.599 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.033) 0:00:31.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:31.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.031) 0:00:31.699 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.031) 0:00:31.730 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.031) 0:00:31.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.083) 0:00:31.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:31.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.038) 0:00:31.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.033) 0:00:31.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:31.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:32.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.036) 0:00:32.057 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.032) 0:00:32.090 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:32.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.035) 0:00:32.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 
June 2022 17:34:26 +0000 (0:00:00.032) 0:00:32.192 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.032) 0:00:32.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.037) 0:00:32.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.035) 0:00:32.297 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.033) 0:00:32.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.032) 0:00:32.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:32.398 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:32.432 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.034) 0:00:32.466 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:34:26 +0000 (0:00:00.088) 0:00:32.554 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.044) 0:00:32.599 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.130) 0:00:32.729 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.038) 0:00:32.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the 
current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.044) 0:00:32.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.044) 0:00:32.857 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.039) 0:00:32.896 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.041) 0:00:32.938 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.034) 0:00:32.973 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.033) 0:00:33.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 
01 June 2022 17:34:27 +0000 (0:00:00.031) 0:00:33.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.035) 0:00:33.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.050) 0:00:33.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.036) 0:00:33.159 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:34:27 +0000 (0:00:00.037) 0:00:33.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] 
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  17:34:27 +0000 (0:00:00.037)       0:00:33.227 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  17:34:27 +0000 (0:00:00.041)       0:00:33.264 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  17:34:27 +0000 (0:00:00.044)       0:00:33.305 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  17:34:27 +0000 (0:00:00.044)       0:00:33.350 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654104855.8431215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654104855.8431215,
        "dev": 5,
        "device_type": 64770,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 22282,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654104855.8431215,
        "nlink": 1,
        "path": "/dev/mapper/vg1-lv1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.427)       0:00:33.777 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.046)       0:00:33.824 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.043)       0:00:33.868 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.036)       0:00:33.904 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.039)       0:00:33.937 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  17:34:28 +0000 (0:00:00.033)       0:00:33.976 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed":
false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.033) 0:00:34.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.031) 0:00:34.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.082) 0:00:34.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.039) 0:00:34.163 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.032) 0:00:34.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.032) 0:00:34.227 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.034) 0:00:34.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.038) 0:00:34.300 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.035) 0:00:34.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.040) 0:00:34.376 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.039) 0:00:34.416 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.034) 0:00:34.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.034) 0:00:34.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.038) 0:00:34.523 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:34:28 +0000 (0:00:00.034) 0:00:34.558 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.034) 0:00:34.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.036) 0:00:34.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.034) 0:00:34.663 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.034) 0:00:34.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.041) 0:00:34.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.037) 0:00:34.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.036) 0:00:34.813 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:34:29 +0000 (0:00:00.501) 0:00:35.315 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.404) 0:00:35.720 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_expected_size": "2147483648" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.042) 0:00:35.762 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.042) 0:00:35.804 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.033) 0:00:35.838 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.032) 0:00:35.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.037) 0:00:35.907 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.038) 0:00:35.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.038) 0:00:35.985 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.038) 0:00:36.023 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.036) 0:00:36.059 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.043) 0:00:36.103 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.035396", "end": "2022-06-01 13:34:30.271283", "rc": 0, "start": "2022-06-01 13:34:30.235887" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:34:30 +0000 (0:00:00.432) 0:00:36.536 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.044) 0:00:36.580 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.044) 0:00:36.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.034) 0:00:36.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.034) 0:00:36.694 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.034) 0:00:36.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.036) 0:00:36.766 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.039) 0:00:36.805 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", 
"fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.039) 0:00:36.845 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.174) 0:00:37.019 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.040) 0:00:37.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 
3210739712, "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.044) 0:00:37.104 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.042) 0:00:37.147 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.038) 0:00:37.185 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.043) 0:00:37.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 
Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.035) 0:00:37.263 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.034) 0:00:37.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.035) 0:00:37.334 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.039) 0:00:37.374 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.050) 0:00:37.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab 
mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.036) 0:00:37.461 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.039) 0:00:37.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:34:31 +0000 (0:00:00.033) 0:00:37.534 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.036) 0:00:37.570 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.038) 0:00:37.609 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.037) 0:00:37.647 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104855.5801215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104855.5801215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22248, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104855.5801215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.416) 0:00:38.063 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.047) 0:00:38.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.047) 0:00:38.158 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.045) 0:00:38.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.035) 0:00:38.239 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.042) 0:00:38.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.034) 0:00:38.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.036) 0:00:38.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.036) 0:00:38.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.045) 0:00:38.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device 
type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.038) 0:00:38.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.037) 0:00:38.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:34:32 +0000 (0:00:00.035) 0:00:38.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.034) 0:00:38.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.036) 0:00:38.616 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.039) 0:00:38.656 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.043) 0:00:38.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:38.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.034) 0:00:38.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:38.806 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.041) 0:00:38.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.034) 0:00:38.882 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:38.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.034) 0:00:38.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:38.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:39.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.038) 0:00:39.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.035) 0:00:39.097 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 
GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:34:33 +0000 (0:00:00.419) 0:00:39.516 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.425) 0:00:39.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.039) 0:00:39.982 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.037) 0:00:40.020 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.034) 0:00:40.054 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.034) 0:00:40.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:34:34 +0000 
(0:00:00.031) 0:00:40.120 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.033) 0:00:40.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.033) 0:00:40.187 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.039) 0:00:40.227 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.036) 0:00:40.263 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:34:34 +0000 (0:00:00.044) 0:00:40.308 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.044288", "end": "2022-06-01 13:34:34.490459", "rc": 0, "start": "2022-06-01 13:34:34.446171" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= 
LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.447) 0:00:40.756 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.039) 0:00:40.796 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.047) 0:00:40.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.036) 0:00:40.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.036) 0:00:40.917 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.037) 0:00:40.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.037) 0:00:40.992 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.041) 0:00:41.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.047) 0:00:41.082 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.135) 0:00:41.218 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.043) 0:00:41.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.045) 0:00:41.307 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.041) 0:00:41.348 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.037) 0:00:41.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.042) 0:00:41.428 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.034) 0:00:41.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.033) 0:00:41.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.032) 0:00:41.528 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:34:35 +0000 (0:00:00.034) 0:00:41.563 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.052) 0:00:41.615 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.038) 0:00:41.654 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.038) 0:00:41.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.032) 0:00:41.725 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.034) 0:00:41.759 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.042) 0:00:41.802 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.043) 0:00:41.846 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104855.2811215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104855.2811215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22213, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104855.2811215, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.437) 0:00:42.284 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 
17:34:36 +0000 (0:00:00.053) 0:00:42.337 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.059) 0:00:42.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.037) 0:00:42.433 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.032) 0:00:42.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:34:36 +0000 (0:00:00.041) 0:00:42.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.083) 0:00:42.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:42.626 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.032) 0:00:42.658 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.038) 0:00:42.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.033) 0:00:42.729 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.036) 0:00:42.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.033) 0:00:42.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.032) 0:00:42.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
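[editor's note: the LUKS checks above are all skipped because lv3 is not encrypted, and the log goes on to verify that /etc/crypttab holds no entry for the volume. A minimal shell sketch of that counting check follows; the file contents and names are hypothetical sample values, not taken from the host under test.]

```shell
# Hypothetical sketch of the crypttab check: count the entries in a
# crypttab-style file whose mapper name matches the volume under test.
# A temporary sample file stands in for the real /etc/crypttab.
crypttab=$(mktemp)
printf 'luks-1234 /dev/sdb1 none\n' > "$crypttab"
# lv3 is not encrypted, so the role expects zero matching entries.
matches=$(grep -c '^vg1-lv3 ' "$crypttab" || true)
echo "$matches"
rm -f "$crypttab"
```

[the role compares this count against `_storage_test_expected_crypttab_entries`, which it set to "0" for this unencrypted volume.]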
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.033) 0:00:42.865 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.042) 0:00:42.908 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.039) 0:00:42.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.036) 0:00:42.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.018 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.032) 0:00:43.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.031) 0:00:43.151 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.036) 0:00:43.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.257 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.033) 0:00:43.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:34:37 +0000 (0:00:00.034) 0:00:43.325 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.430) 0:00:43.755 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.434) 0:00:44.190 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.041) 0:00:44.232 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.037) 0:00:44.269 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.033) 0:00:44.303 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.033) 0:00:44.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.033) 0:00:44.371 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.034) 0:00:44.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.037) 0:00:44.443 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.039) 0:00:44.483 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 
17:34:38 +0000 (0:00:00.038) 0:00:44.522 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:34:38 +0000 (0:00:00.045) 0:00:44.567 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "vg1/lv3"
    ],
    "delta": "0:00:00.040674",
    "end": "2022-06-01 13:34:38.763029",
    "rc": 0,
    "start": "2022-06-01 13:34:38.722355"
}

STDOUT:

LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.470) 0:00:45.038 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.045) 0:00:45.083 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.044) 0:00:45.128 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.035) 0:00:45.164 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.036) 0:00:45.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.034) 0:00:45.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.034) 0:00:45.270 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.035) 0:00:45.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.033) 0:00:45.340 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.030) 0:00:45.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous 
invocation minus the pool raid options] ************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:58 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.031) 0:00:45.402 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.106) 0:00:45.509 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:34:39 +0000 (0:00:00.049) 0:00:45.559 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.582) 0:00:46.141 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.079) 0:00:46.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.033) 0:00:46.254 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.034) 0:00:46.288 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.069) 0:00:46.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.028) 0:00:46.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:34:40 +0000 
(0:00:00.033) 0:00:46.419 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "sda",
                "sdb",
                "sdc"
            ],
            "name": "vg1",
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "mount_point": "/opt/test1",
                    "name": "lv1",
                    "size": "2g"
                },
                {
                    "mount_point": "/opt/test2",
                    "name": "lv2",
                    "size": "3g"
                },
                {
                    "mount_point": "/opt/test3",
                    "name": "lv3",
                    "size": "3g"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.044) 0:00:46.464 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.036) 0:00:46.501 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.034) 0:00:46.536 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:34:40 +0000 (0:00:00.031) 0:00:46.568 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:34:41
(0:00:00.033) 0:00:46.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:34:41 +0000 (0:00:00.034) 0:00:46.637 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:34:41 +0000 (0:00:00.048) 0:00:46.685 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:34:41 +0000 (0:00:00.031) 0:00:46.717 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "mdadm", "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", 
"raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": 
"present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:34:43 +0000 (0:00:02.342) 0:00:49.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.032) 0:00:49.091 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.030) 0:00:49.121 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ 
"/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "mdadm", "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": 
null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test 
verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.047) 0:00:49.169 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.043) 0:00:49.212 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.038) 0:00:49.251 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:34:43 +0000 (0:00:00.032) 0:00:49.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:34:44 +0000 (0:00:00.717) 0:00:50.001 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { 
"ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:34:45 +0000 (0:00:01.140) 0:00:51.142 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:34:46 +0000 (0:00:00.687) 0:00:51.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 
June 2022 17:34:46 +0000 (0:00:00.426) 0:00:52.256 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:34:46 +0000 (0:00:00.032) 0:00:52.289 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert to preserve RAID settings for preexisting pool] ******************* task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:78 Wednesday 01 June 2022 17:34:47 +0000 (0:00:00.909) 0:00:53.198 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:87 Wednesday 01 June 2022 17:34:47 +0000 (0:00:00.041) 0:00:53.240 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:34:47 +0000 (0:00:00.060) 0:00:53.300 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:34:47 +0000 (0:00:00.049) 0:00:53.350 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:34:47 +0000 (0:00:00.033) 0:00:53.384 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "10G", "type": "raid1", "uuid": "NO99h4-jEV3-4F7A-IEDh-KsNa-Hael-AIf0Dn" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": 
"10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "95300f00-6f60-c2bd-2965-11492f36ea38" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:34:48 +0000 (0:00:00.421) 0:00:53.805 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003660", "end": "2022-06-01 13:34:47.934625", "rc": 0, "start": "2022-06-01 
13:34:47.930965" } STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0
/dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0
/dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:34:48 +0000 (0:00:00.399) 0:00:54.205 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003231", "end": "2022-06-01 13:34:48.338764", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:34:48.335533" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.399) 0:00:54.605 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.084) 0:00:54.689 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.034) 0:00:54.724 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.078) 0:00:54.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.050) 0:00:54.853 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.483) 0:00:55.337 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": 
"/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.051) 0:00:55.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.048) 0:00:55.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.039) 0:00:55.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.038) 0:00:55.515 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:34:49 +0000 (0:00:00.040) 0:00:55.555 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.048) 0:00:55.604 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get 
information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.059) 0:00:55.664 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.011113", "end": "2022-06-01 13:34:49.827970", "rc": 0, "start": "2022-06-01 13:34:49.816857" } STDOUT:
/dev/md/vg1-1:
           Version : 1.0
     Creation Time : Wed Jun  1 13:34:09 2022
        Raid Level : raid1
        Array Size : 10484608 (10.00 GiB 10.74 GB)
     Used Dev Size : 10484608 (10.00 GiB 10.74 GB)
      Raid Devices : 2
     Total Devices : 3
       Persistence : Superblock is persistent
     Intent Bitmap : Internal
       Update Time : Wed Jun  1 13:34:49 2022
             State : clean, resyncing
    Active Devices : 2
   Working Devices : 3
    Failed Devices : 0
     Spare Devices : 1
Consistency Policy : bitmap
     Resync Status : 78% complete
              Name : vg1-1
              UUID : 95300f00:6f60c2bd:29651149:2f36ea38
            Events : 21
    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1
       2       8       33        -      spare   /dev/sdc1
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.431) 0:00:56.095 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.045) 0:00:56.140 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.042) 0:00:56.182 ******** ok:
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.044) 0:00:56.227 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.047) 0:00:56.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.048) 0:00:56.323 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.045) 0:00:56.368 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.034) 0:00:56.403 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:34:50 +0000 (0:00:00.068) 0:00:56.472 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.102) 0:00:56.575 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.036) 0:00:56.611 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.037) 0:00:56.649 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:56.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.035) 0:00:56.719 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.033) 0:00:56.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:56.787 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:56.821 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.037) 0:00:56.858 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:56.893 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.067) 0:00:56.960 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.041) 0:00:57.001 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }
TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.038) 0:00:57.040 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.062) 0:00:57.103 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.039) 0:00:57.142 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.040) 0:00:57.182 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.038) 0:00:57.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:57.255 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:57.289 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.035) 0:00:57.325 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.034) 0:00:57.360 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2
TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:34:51 +0000 (0:00:00.069) 0:00:57.429 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.148) 0:00:57.577 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:57.613 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.033) 0:00:57.646 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.034) 0:00:57.681 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.034) 0:00:57.716 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.036) 0:00:57.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.033) 0:00:57.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.032) 0:00:57.819 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.033) 0:00:57.853 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:57.888 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:57.923 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.039) 0:00:57.962 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.039) 0:00:58.002 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:58.038 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.033) 0:00:58.072 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.036) 0:00:58.108 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.040) 0:00:58.149 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.039) 0:00:58.188 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.034) 0:00:58.222 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.036) 0:00:58.259 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:58.294 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.035) 0:00:58.329 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.034) 0:00:58.364 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.037) 0:00:58.402 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.036) 0:00:58.438 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [verify the volumes]
******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:34:52 +0000 (0:00:00.039) 0:00:58.478 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.106) 0:00:58.584 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.046) 0:00:58.631 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.137) 0:00:58.768 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.039) 0:00:58.808 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "96b106fb-b11d-4bd3-b358-3a25d1450cc0" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.048) 0:00:58.856 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.043) 0:00:58.899 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.039) 0:00:58.939 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.042) 0:00:58.982 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.034) 0:00:59.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.033) 0:00:59.051 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.036) 0:00:59.087 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.033) 0:00:59.120 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.048) 0:00:59.169 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.035) 0:00:59.204 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.039) 0:00:59.244 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.032) 0:00:59.277 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.034) 0:00:59.312 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.042) 0:00:59.354 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:34:53 +0000 (0:00:00.039) 0:00:59.394 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104855.8431215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104855.8431215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22282, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104855.8431215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.407) 0:00:59.802 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.044) 0:00:59.847 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.092) 0:00:59.939 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.038) 0:00:59.978 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.035) 0:01:00.013 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.040) 0:01:00.054 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.036) 0:01:00.090 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.035) 0:01:00.125 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.033) 0:01:00.158 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.040) 0:01:00.198 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.033) 0:01:00.232 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.034) 0:01:00.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.037) 0:01:00.304 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.034) 0:01:00.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.035) 0:01:00.375 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.042) 0:01:00.417 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.039) 0:01:00.457 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.032) 0:01:00.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:34:54 +0000 (0:00:00.033) 0:01:00.523 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.030) 0:01:00.554 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.034) 0:01:00.588 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.040) 0:01:00.629 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.034) 0:01:00.663 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.034) 0:01:00.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.037) 0:01:00.735 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.036) 0:01:00.772 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.037) 0:01:00.809 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.036) 0:01:00.846 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:34:55 +0000 (0:00:00.408) 0:01:01.254 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.397) 0:01:01.652 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.041) 0:01:01.693 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.039) 0:01:01.733 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.034) 0:01:01.768 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.037) 0:01:01.805 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.036) 0:01:01.841 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.034) 0:01:01.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.037) 0:01:01.913 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.043)
0:01:01.957 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.038) 0:01:01.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.046) 0:01:02.042 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.035557", "end": "2022-06-01 13:34:56.225927", "rc": 0, "start": "2022-06-01 13:34:56.190370" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.452) 0:01:02.494 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:34:56 +0000 (0:00:00.044) 0:01:02.539 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.044) 0:01:02.583 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.036) 0:01:02.620 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.037) 0:01:02.658 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.035) 0:01:02.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.035) 0:01:02.728 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.032) 0:01:02.760 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.039) 0:01:02.799 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for 
/cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.179) 0:01:02.979 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.039) 0:01:03.019 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, 
"mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "f8b082cb-056b-4da7-93d0-1f28070a1193" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.045) 0:01:03.064 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.040) 0:01:03.105 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.040) 0:01:03.145 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.040) 0:01:03.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.032) 0:01:03.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:34:57 +0000 
(0:00:00.032) 0:01:03.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.031) 0:01:03.283 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.032) 0:01:03.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.053) 0:01:03.370 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.038) 0:01:03.408 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.043) 0:01:03.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.036) 0:01:03.489 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:34:57 +0000 (0:00:00.041) 0:01:03.531 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.050) 0:01:03.581 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.041) 0:01:03.623 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104855.5801215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104855.5801215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22248, "isblk": true, "ischr": false, "isdir": 
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104855.5801215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.418) 0:01:04.041 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.043) 0:01:04.084 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.041) 0:01:04.126 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.038) 0:01:04.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.033) 0:01:04.198 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the 
LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.039) 0:01:04.237 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.032) 0:01:04.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.032) 0:01:04.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.032) 0:01:04.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.041) 0:01:04.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.031) 0:01:04.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.033) 0:01:04.440 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.033) 0:01:04.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.035) 0:01:04.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:34:58 +0000 (0:00:00.034) 0:01:04.543 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.043) 0:01:04.587 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.038) 0:01:04.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.032) 0:01:04.725 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.036) 0:01:04.762 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.829 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.031) 0:01:04.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.033) 0:01:04.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.036) 0:01:04.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.035) 0:01:04.999 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:34:59 +0000 (0:00:00.455) 0:01:05.454 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": 
"3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.409) 0:01:05.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.041) 0:01:05.904 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.035) 0:01:05.940 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.033) 0:01:05.973 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.034) 0:01:06.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.037) 0:01:06.046 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.034) 0:01:06.080 ******** skipping: [/cache/rhel-x.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.034) 0:01:06.115 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.039) 0:01:06.154 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.036) 0:01:06.190 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:35:00 +0000 (0:00:00.048) 0:01:06.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.037753", "end": "2022-06-01 13:35:00.414660", "rc": 0, "start": "2022-06-01 13:35:00.376907" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.448) 0:01:06.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": 
[ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.047) 0:01:06.735 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.057) 0:01:06.792 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.038) 0:01:06.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.035) 0:01:06.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.040) 0:01:06.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.035) 0:01:06.943 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.035) 0:01:06.979 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.041) 0:01:07.020 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.133) 0:01:07.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.044) 0:01:07.198 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 
4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "281fae15-ae6c-49d0-8032-182d9732330f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.047) 0:01:07.245 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.042) 0:01:07.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.039) 0:01:07.327 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 
Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.055) 0:01:07.382 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.040) 0:01:07.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.034) 0:01:07.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.033) 0:01:07.490 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:35:01 +0000 (0:00:00.036) 0:01:07.527 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false 
} TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.053) 0:01:07.580 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.038) 0:01:07.619 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.037) 0:01:07.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.031) 0:01:07.688 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.031) 0:01:07.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.038) 
0:01:07.758 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.039) 0:01:07.798 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104855.2811215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104855.2811215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22213, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104855.2811215, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.418) 0:01:08.216 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.092) 0:01:08.309 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.041) 0:01:08.350 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.036) 0:01:08.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.035) 0:01:08.422 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.044) 0:01:08.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.038) 0:01:08.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:35:02 +0000 (0:00:00.036) 0:01:08.542 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.034) 0:01:08.577 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS 
volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.040) 0:01:08.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.034) 0:01:08.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.034) 0:01:08.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.036) 0:01:08.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.033) 0:01:08.757 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.032) 0:01:08.790 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.039) 0:01:08.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.038) 0:01:08.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.035) 0:01:08.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.035) 0:01:08.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.032) 0:01:08.971 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.033) 0:01:09.005 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.036) 0:01:09.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.034) 0:01:09.076 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.032) 0:01:09.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.036) 0:01:09.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.036) 0:01:09.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.035) 0:01:09.216 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:35:03 +0000 (0:00:00.034) 0:01:09.250 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.400) 0:01:09.651 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.392) 0:01:10.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.039) 0:01:10.083 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.037) 0:01:10.120 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.033) 0:01:10.154 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 
Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.032) 0:01:10.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.034) 0:01:10.221 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.035) 0:01:10.256 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.034) 0:01:10.291 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.037) 0:01:10.329 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.039) 0:01:10.368 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:35:04 +0000 (0:00:00.045) 0:01:10.414 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.040211", "end": "2022-06-01 13:35:04.601898", "rc": 0, "start": "2022-06-01 13:35:04.561687" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.459) 0:01:10.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.042) 0:01:10.916 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.041) 0:01:10.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.043) 0:01:11.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.050) 0:01:11.052 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.039) 0:01:11.091 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.039) 0:01:11.131 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.036) 0:01:11.167 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.038) 0:01:11.206 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.081) 0:01:11.288 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the pool created above] ******************************************* task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:89 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.034) 0:01:11.322 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:35:05 +0000 
(0:00:00.078) 0:01:11.401 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:35:05 +0000 (0:00:00.053) 0:01:11.455 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.572) 0:01:12.028 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.075) 0:01:12.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.032) 0:01:12.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.032) 0:01:12.168 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.067) 0:01:12.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.031) 0:01:12.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.034) 0:01:12.301 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { 
"mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.044) 0:01:12.345 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.038) 0:01:12.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.036) 0:01:12.420 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.037) 0:01:12.457 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.033) 0:01:12.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 
Wednesday 01 June 2022 17:35:06 +0000 (0:00:00.034) 0:01:12.525 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:35:07 +0000 (0:00:00.048) 0:01:12.574 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:35:07 +0000 (0:00:00.029) 0:01:12.604 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { 
"action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": 
"/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:35:11 +0000 (0:00:04.243) 0:01:16.848 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:35:11 +0000 (0:00:00.035) 0:01:16.883 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:35:11 +0000 (0:00:00.032) 0:01:16.916 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy 
format", "device": "/dev/sdc", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": 
[], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:35:11 +0000 (0:00:00.059) 0:01:16.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ 
"sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:35:11 +0000 (0:00:00.048) 0:01:17.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:35:11 +0000 (0:00:00.042) 0:01:17.065 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": 
"absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:35:12 +0000 (0:00:01.175) 0:01:18.241 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:35:13 +0000 (0:00:00.727) 0:01:18.968 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:35:13 +0000 (0:00:00.034) 0:01:19.002 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:35:14 +0000 (0:00:00.704) 0:01:19.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:35:14 +0000 (0:00:00.412) 0:01:20.119 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:35:14 +0000 (0:00:00.036) 0:01:20.155 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:113 Wednesday 01 June 2022 17:35:15 +0000 (0:00:00.898) 0:01:21.053 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:35:15 +0000 (0:00:00.058) 0:01:21.112 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:35:15 +0000 (0:00:00.050) 0:01:21.162 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:35:15 +0000 (0:00:00.032) 0:01:21.195 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:35:16 +0000 (0:00:00.391) 0:01:21.586 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002778", 
"end": "2022-06-01 13:35:15.704241", "rc": 0, "start": "2022-06-01 13:35:15.701463" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:35:16 +0000 (0:00:00.380) 0:01:21.967 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002706", "end": "2022-06-01 13:35:16.095152", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:35:16.092446" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:35:16 +0000 (0:00:00.396) 0:01:22.363 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:35:16 +0000 (0:00:00.081) 0:01:22.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:35:16 +0000 (0:00:00.033) 0:01:22.478 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.108) 0:01:22.586 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.042) 0:01:22.628 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.027) 0:01:22.656 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.027) 0:01:22.684 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.043) 0:01:22.727 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.041) 0:01:22.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.038) 0:01:22.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.037) 0:01:22.845 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.029) 0:01:22.874 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.056) 0:01:22.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:35:17 +0000 
(0:00:00.034) 0:01:22.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.033) 0:01:22.999 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.032 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.031) 0:01:23.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.033) 0:01:23.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": 
null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.194 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.067) 0:01:23.262 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.090) 0:01:23.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.036) 0:01:23.454 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.036) 0:01:23.491 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.037) 0:01:23.529 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:35:17 +0000 (0:00:00.032) 0:01:23.561 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.030) 0:01:23.592 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.029) 0:01:23.621 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.029) 0:01:23.651 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.065) 0:01:23.716 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.038) 0:01:23.754 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.029) 0:01:23.784 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.027) 0:01:23.812 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.030) 0:01:23.842 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.066) 0:01:23.908 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.099) 0:01:24.008 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.040 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.033) 0:01:24.073 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.031) 0:01:24.105 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.031) 0:01:24.137 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.031) 0:01:24.168 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.201 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.033) 0:01:24.235 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.268 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.300 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.333 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.365 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.035) 0:01:24.401 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.433 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.038) 0:01:24.472 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.033) 0:01:24.505 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:35:18 +0000 (0:00:00.032) 0:01:24.537 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.033) 0:01:24.570 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.035) 0:01:24.606 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.081) 0:01:24.688 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.033) 0:01:24.722 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.032) 0:01:24.754 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.031) 0:01:24.786 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.030) 0:01:24.816 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.033) 0:01:24.850 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.035) 0:01:24.886 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.080) 0:01:24.966 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.036) 0:01:25.003 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.121) 0:01:25.125 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/vg1-lv1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.038) 0:01:25.163 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.039) 0:01:25.203 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.029) 0:01:25.232 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.033) 0:01:25.266 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.029) 0:01:25.295 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.028) 0:01:25.324 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.030) 0:01:25.355 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.028) 0:01:25.384 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.033) 0:01:25.417 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.052) 0:01:25.469 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.027) 0:01:25.496 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:35:19 +0000 (0:00:00.038) 0:01:25.535 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.040) 0:01:25.576 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.034) 0:01:25.611 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.033) 0:01:25.644 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.025) 0:01:25.669 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.406) 0:01:26.076 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.040) 0:01:26.117 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.028) 0:01:26.145 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.038) 0:01:26.184 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.032) 0:01:26.217 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.028) 0:01:26.245 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.035) 0:01:26.281 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.032) 0:01:26.313 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.032) 0:01:26.345 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.030) 0:01:26.376 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.031) 0:01:26.408 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.029) 0:01:26.438 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.029) 0:01:26.467 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.029) 0:01:26.497 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:35:20 +0000 (0:00:00.031) 0:01:26.528 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.042) 0:01:26.571 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.036) 0:01:26.607 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.030) 0:01:26.638 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:26.669 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:26.701 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.030) 0:01:26.731 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:26.763 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.029) 0:01:26.793 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.032) 0:01:26.826 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.029) 0:01:26.855 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.029) 0:01:26.884 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.028) 0:01:26.913 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.033) 0:01:26.947 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.105) 0:01:27.052 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.033) 0:01:27.086 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:27.118 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:27.150 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.032) 0:01:27.183 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.032) 0:01:27.215 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.034) 0:01:27.250 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.033) 0:01:27.283 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:27.314 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.036) 0:01:27.350 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "3221225472"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.033) 0:01:27.383 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.029) 0:01:27.413 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:27.444 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.031) 0:01:27.476 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.032) 0:01:27.509 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:35:21 +0000 (0:00:00.034) 0:01:27.544 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.031) 0:01:27.576 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.031) 0:01:27.607 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.034) 0:01:27.642 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.032) 0:01:27.674 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.037) 0:01:27.712 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.122) 0:01:27.834 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/vg1-lv2"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.037) 0:01:27.872 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.040) 0:01:27.913 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.034) 0:01:27.947 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.040) 0:01:27.988 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.034) 0:01:28.023 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.028) 0:01:28.052 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.031) 0:01:28.083 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.034) 0:01:28.117 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.037) 0:01:28.155 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.049) 0:01:28.205 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.028) 0:01:28.233 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.039)
0:01:28.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.031) 0:01:28.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.032) 0:01:28.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.032) 0:01:28.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:35:22 +0000 (0:00:00.025) 0:01:28.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.409) 0:01:28.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.038) 0:01:28.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.027) 0:01:28.870 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.033) 0:01:28.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.028) 0:01:28.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.027) 0:01:28.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.028) 0:01:28.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.031) 0:01:29.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.030) 0:01:29.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.025) 0:01:29.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.031) 0:01:29.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.033) 0:01:29.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.029) 0:01:29.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.029) 0:01:29.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.029) 0:01:29.228 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.039) 0:01:29.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.039) 0:01:29.307 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.035) 0:01:29.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.096) 0:01:29.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.036) 0:01:29.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.034) 0:01:29.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:35:23 +0000 (0:00:00.033) 0:01:29.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.035) 0:01:29.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:29.613 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.035) 0:01:29.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:29.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.030) 0:01:29.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.030) 0:01:29.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.029) 0:01:29.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.032) 0:01:29.805 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.035) 0:01:29.841 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.034) 0:01:29.875 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.029) 0:01:29.905 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.031) 0:01:29.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.034) 0:01:29.971 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:30.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.036) 0:01:30.041 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.034) 0:01:30.075 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK 
[assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:30.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.030) 0:01:30.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.030) 0:01:30.170 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.030) 0:01:30.201 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.032) 0:01:30.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.032) 0:01:30.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.032) 0:01:30.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:30.331 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:30.365 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.033) 0:01:30.398 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.041) 0:01:30.439 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml 
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:35:24 +0000 (0:00:00.126) 0:01:30.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.037) 0:01:30.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.041) 0:01:30.645 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.031) 0:01:30.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.035) 0:01:30.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.033) 0:01:30.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.031) 0:01:30.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.031) 0:01:30.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.031) 0:01:30.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.031) 0:01:30.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.048) 0:01:30.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.041) 0:01:30.961 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.042) 0:01:31.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.033) 0:01:31.038 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.035) 0:01:31.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.034) 0:01:31.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.028) 0:01:31.136 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.381) 0:01:31.517 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:35:25 +0000 (0:00:00.039) 0:01:31.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:35:26 +0000 (0:00:00.028) 0:01:31.585 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:35:26 +0000 (0:00:00.036) 0:01:31.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032)       0:01:31.653 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.025)       0:01:31.679 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.033) 0:01:31.713 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032) 0:01:31.745 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.091) 0:01:31.837 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.026) 0:01:31.863 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032) 0:01:31.896 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.029) 0:01:31.925 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.031) 0:01:31.956 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032) 0:01:31.989 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.036) 0:01:32.026 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.041) 0:01:32.067 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.039) 0:01:32.107 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.031) 0:01:32.139 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032) 0:01:32.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.031) 0:01:32.203 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.035) 0:01:32.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.033) 0:01:32.272 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.033) 0:01:32.306 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.032) 0:01:32.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.033) 0:01:32.372 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.033) 0:01:32.406 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.036) 0:01:32.442 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.034) 0:01:32.477 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.031) 0:01:32.508 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022  17:35:26 +0000 (0:00:00.029) 0:01:32.538 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.031) 0:01:32.569 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.035) 0:01:32.605 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.035) 0:01:32.640 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.033) 0:01:32.674 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.033) 0:01:32.707 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.039) 0:01:32.747 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.034) 0:01:32.782 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.038) 0:01:32.821 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.039) 0:01:32.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.031) 0:01:32.892 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.029) 0:01:32.922 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.032) 0:01:32.954 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.031) 0:01:32.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.032) 0:01:33.019 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.034) 0:01:33.054 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.030) 0:01:33.084 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.029) 0:01:33.114 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.029) 0:01:33.143 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.028) 0:01:33.173 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.028) 0:01:33.202 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=542  changed=4    unreachable=0    failed=0    skipped=495  rescued=0    ignored=0

Wednesday 01 June 2022  17:35:27 +0000 (0:00:00.021)       0:01:33.223 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 9.52s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 4.24s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.34s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.39s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : set up new/current mounts ------------------ 1.35s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
linux-system-roles.storage : remove obsolete mounts --------------------- 1.18s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
linux-system-roles.storage : set up new/current mounts ------------------ 1.14s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:2 --------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.04s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.98s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.86s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.73s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : get required packages ---------------------- 0.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  17:35:28 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:35:29 +0000 (0:00:01.342)       0:00:01.366 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_raid_pool_options_nvme_generated.yml ***************************
2 plays in /tmp/tmp7247_7fr/tests/tests_raid_pool_options_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:35:29 +0000 (0:00:00.018)       0:00:01.384 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  17:35:30 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:35:31 +0000 (0:00:01.315)       0:00:01.338 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_raid_pool_options_scsi_generated.yml ***************************
2 plays in /tmp/tmp7247_7fr/tests/tests_raid_pool_options_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options_scsi_generated.yml:3
Wednesday 01 June 2022  17:35:31 +0000 (0:00:00.016)       0:00:01.355 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options_scsi_generated.yml:7
Wednesday 01 June 2022  17:35:33 +0000 (0:00:01.091)       0:00:02.446 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:2
Wednesday 01 June 2022  17:35:33 +0000 (0:00:00.025)       0:00:02.472 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:17
Wednesday 01 June 2022  17:35:33 +0000 (0:00:00.860)       0:00:03.333 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:35:33 +0000 (0:00:00.040)       0:00:03.373 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:35:34 +0000 (0:00:00.164)       0:00:03.538 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:35:34 +0000 (0:00:00.780)       0:00:04.319 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:35:34 +0000 (0:00:00.081)       0:00:04.400 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:35:34 +0000 (0:00:00.023)       0:00:04.424 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:35:35 +0000 (0:00:00.025)       0:00:04.450 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:35:35 +0000 (0:00:00.198)       0:00:04.648 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:35:35 +0000 (0:00:00.021)       0:00:04.670 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:35:36 +0000 (0:00:01.094)       0:00:05.764 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:35:36 +0000 (0:00:00.048)       0:00:05.812 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:35:36 +0000 (0:00:00.047)       0:00:05.860 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:35:37 +0000 (0:00:00.718)       0:00:06.579 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  17:35:37 +0000 (0:00:00.083)       0:00:06.662 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  17:35:37 +0000 (0:00:00.021)       0:00:06.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022  17:35:37 +0000 (0:00:00.022)       0:00:06.706 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022  17:35:37 +0000 (0:00:00.020)       0:00:06.727 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022  17:35:38 +0000 (0:00:00.853)       0:00:07.580 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service":
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:35:40 +0000 (0:00:01.881) 0:00:09.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.043) 0:00:09.505 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.029) 0:00:09.534 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.561) 0:00:10.096 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.030) 0:00:10.126 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.027) 0:00:10.154 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.033) 0:00:10.187 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.035) 0:00:10.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.034) 0:00:10.257 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.029) 0:00:10.286 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.028) 0:00:10.314 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.026) 0:00:10.341 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:35:40 +0000 (0:00:00.027) 0:00:10.368 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:35:41 +0000 (0:00:00.528) 0:00:10.896 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:35:41 +0000 (0:00:00.029) 0:00:10.926 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:20
Wednesday 01 June 2022 17:35:42 +0000 (0:00:00.887) 0:00:11.813 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:27
Wednesday 01 June 2022 17:35:42 +0000 (0:00:00.031) 0:00:11.844 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:35:42 +0000 (0:00:00.046) 0:00:11.891 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.561) 0:00:12.452 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.043) 0:00:12.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.037) 0:00:12.533 ********
ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb", "sdc" ] }

TASK [Create a RAID1 device] ***************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:32
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.035) 0:00:12.568 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.095) 0:00:12.664 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.043) 0:00:12.708 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.546) 0:00:13.255 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.069) 0:00:13.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.030) 0:00:13.355 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:35:43 +0000 (0:00:00.031) 0:00:13.386 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.067) 0:00:13.453 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.027) 0:00:13.481 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.030) 0:00:13.511 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present",
"type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.037) 0:00:13.549 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.033) 0:00:13.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.032) 0:00:13.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.031) 0:00:13.646 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.030) 0:00:13.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.030) 0:00:13.707 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.044) 0:00:13.752 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:35:44 +0000 (0:00:00.027) 0:00:13.780 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", 
"device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": 
null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:35:53 +0000 (0:00:09.462) 0:00:23.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:35:53 +0000 (0:00:00.033) 0:00:23.277 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:35:53 +0000 (0:00:00.028) 0:00:23.305 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "create format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "create device", "device": 
"/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:35:53 +0000 (0:00:00.049) 0:00:23.354 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK 
[linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:35:53 +0000 (0:00:00.044) 0:00:23.398 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:35:53 +0000 (0:00:00.036) 0:00:23.435 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:35:54 +0000 (0:00:00.030) 0:00:23.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:35:55 +0000 (0:00:00.996) 0:00:24.463 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": 
"xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:35:56 +0000 (0:00:01.414) 0:00:25.877 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:35:57 +0000 (0:00:00.722) 0:00:26.600 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", 
"pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:35:57 +0000 (0:00:00.404) 0:00:27.004 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:35:57 +0000 (0:00:00.031) 0:00:27.035 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:56 Wednesday 01 June 2022 17:35:58 +0000 (0:00:01.012) 0:00:28.048 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:35:58 +0000 (0:00:00.068) 0:00:28.117 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:35:58 +0000 (0:00:00.051) 0:00:28.168 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:35:58 +0000 (0:00:00.033) 0:00:28.202 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "10G", "type": "raid1", "uuid": "5Lavqr-Aww3-dp69-YLXC-BBpM-7QWJ-8fljJW" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, 
"/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:35:59 +0000 (0:00:00.514) 0:00:28.717 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002856", 
"end": "2022-06-01 13:35:59.065551", "rc": 0, "start": "2022-06-01 13:35:59.062695" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 /dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:35:59 +0000 (0:00:00.484) 0:00:29.201 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003388", "end": "2022-06-01 13:35:59.472370", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:35:59.468982" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.408) 0:00:29.610 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.074) 0:00:29.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.033) 0:00:29.719 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.068) 0:00:29.787 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.042) 0:00:29.829 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.547) 0:00:30.377 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:36:00 +0000 (0:00:00.047) 0:00:30.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.042) 0:00:30.466 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.040) 0:00:30.507 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.042) 0:00:30.549 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.040) 0:00:30.590 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.048) 0:00:30.638 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.063) 0:00:30.701 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.013049", "end": "2022-06-01 13:36:01.013035", "rc": 0, "start": "2022-06-01 13:36:00.999986" } STDOUT:
/dev/md/vg1-1:
           Version : 1.0
     Creation Time : Wed Jun 1 13:35:46 2022
        Raid Level : raid1
        Array Size : 10484608 (10.00 GiB 10.74 GB)
     Used Dev Size : 10484608 (10.00 GiB 10.74 GB)
      Raid Devices : 2
     Total Devices : 3
       Persistence : Superblock is persistent
     Intent Bitmap : Internal
       Update Time : Wed Jun 1 13:35:59 2022
             State : clean, resyncing
    Active Devices : 2
   Working Devices : 3
    Failed Devices : 0
     Spare Devices : 1
Consistency Policy : bitmap
     Resync Status : 28% complete
              Name : vg1-1
              UUID : a3a06af4:9ec87134:50845088:7e608829
            Events : 13

    Number   Major   Minor   RaidDevice   State
       0       8       1        0         active sync   /dev/sda1
       1       8       17       1         active sync   /dev/sdb1
       2       8       33       -         spare         /dev/sdc1
TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.453) 0:00:31.155 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.043) 0:00:31.199 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.043) 0:00:31.242 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.043) 0:00:31.285 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.043) 0:00:31.329 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:36:01
+0000 (0:00:00.046) 0:00:31.375 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:36:01 +0000 (0:00:00.043) 0:00:31.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.031) 0:00:31.450 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.069) 0:00:31.520 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.096) 0:00:31.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.033) 0:00:31.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.036) 0:00:31.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.034) 0:00:31.721 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.030) 0:00:31.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.030) 0:00:31.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.030) 0:00:31.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.031) 0:00:31.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.036) 0:00:31.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.034) 0:00:31.917 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.072) 0:00:31.989 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.046) 0:00:32.035 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.045) 0:00:32.081 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.069) 0:00:32.151 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.039) 0:00:32.190 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.040) 0:00:32.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.034) 0:00:32.265 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.033) 0:00:32.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.036) 0:00:32.335 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.038) 0:00:32.374 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:36:02 +0000 (0:00:00.036) 0:00:32.410 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.073) 0:00:32.484 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.106) 0:00:32.590 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:32.625 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:32.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:32.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:32.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.036) 0:00:32.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.032) 0:00:32.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:32.831 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.036) 0:00:32.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 
2022 17:36:03 +0000 (0:00:00.031) 0:00:32.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.037) 0:00:32.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.039) 0:00:32.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.087) 0:00:33.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.035) 0:00:33.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:33.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:33.168 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:33.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.032) 0:00:33.236 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.035) 0:00:33.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.032) 0:00:33.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.032) 0:00:33.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.033) 0:00:33.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:36:03 +0000 (0:00:00.034) 0:00:33.405 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.034) 0:00:33.440 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.037) 0:00:33.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.035) 0:00:33.513 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.095) 0:00:33.608 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK 
[include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.042) 0:00:33.651 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.142) 0:00:33.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.041) 0:00:33.834 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8" } ], 
"storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 509827, "block_size": 4096, "block_total": 521728, "block_used": 11901, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1048573, "inode_total": 1048576, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 2088251392, "size_total": 2136997888, "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.047) 0:00:33.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.042) 0:00:33.924 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.037) 0:00:33.962 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.042) 0:00:34.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.034) 0:00:34.039 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.035) 0:00:34.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.032) 0:00:34.107 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.038) 0:00:34.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.049) 0:00:34.195 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.035) 0:00:34.231 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.039) 0:00:34.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.035) 0:00:34.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.040) 0:00:34.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.043) 0:00:34.390 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:36:04 +0000 (0:00:00.039) 0:00:34.430 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { 
"atime": 1654104953.0141215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104953.0141215, "dev": 5, "device_type": 64770, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22896, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104953.0141215, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.426) 0:00:34.857 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.055) 0:00:34.912 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.038) 0:00:34.951 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.040) 0:00:34.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.039) 0:00:35.032 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.046) 0:00:35.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.034) 0:00:35.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.035) 0:00:35.148 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.034) 0:00:35.183 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.044) 0:00:35.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.034) 0:00:35.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.034) 0:00:35.297 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.035) 0:00:35.332 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.034) 0:00:35.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:36:05 +0000 (0:00:00.036) 0:00:35.403 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.099) 0:00:35.502 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.039) 0:00:35.542 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.038) 0:00:35.581 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.034) 0:00:35.615 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.034) 0:00:35.650 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.037) 0:00:35.687 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.035) 0:00:35.723 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.035) 0:00:35.758 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.034) 0:00:35.793 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.034) 0:00:35.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.032) 0:00:35.860 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.034) 0:00:35.894 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:36:06 +0000 (0:00:00.035) 0:00:35.930 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.516) 0:00:36.446 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.405) 0:00:36.852 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.043) 0:00:36.896 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.037) 0:00:36.934 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.036) 0:00:36.971 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.035) 0:00:37.006 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.034) 0:00:37.041 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.036) 0:00:37.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.037) 0:00:37.114 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.038) 0:00:37.152 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.041) 0:00:37.194 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:36:07 +0000 (0:00:00.042) 0:00:37.236 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.040704", "end": "2022-06-01 13:36:07.550808", "rc": 0, "start": "2022-06-01 13:36:07.510104" }
STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.461) 0:00:37.698 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.042) 0:00:37.740 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.045) 0:00:37.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.039) 0:00:37.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.036) 0:00:37.861 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.036) 0:00:37.898 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.038) 0:00:37.936 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.035) 0:00:37.972 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.050) 0:00:38.022 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.156) 0:00:38.178 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.043) 0:00:38.221 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.052) 0:00:38.274 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.043) 0:00:38.317 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:36:08 +0000 (0:00:00.044) 0:00:38.362 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.094) 0:00:38.456 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.032) 0:00:38.489 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.032) 0:00:38.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.034) 0:00:38.556 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.033) 0:00:38.590 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.054) 0:00:38.644 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.038) 0:00:38.683 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.039) 0:00:38.722 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.030) 0:00:38.753 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.034) 0:00:38.788 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.056) 0:00:38.844 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.044) 0:00:38.888 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104952.7281215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104952.7281215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22862, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104952.7281215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.426) 0:00:39.314 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.040) 0:00:39.355 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.040) 0:00:39.396 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:36:09 +0000 (0:00:00.039) 0:00:39.435 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:39.470 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.041) 0:00:39.511 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.035) 0:00:39.547 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:39.582 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.032) 0:00:39.614 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.039) 0:00:39.654 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.031) 0:00:39.685 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.035) 0:00:39.721 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.032) 0:00:39.754 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.031) 0:00:39.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.031) 0:00:39.817 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.041) 0:00:39.859 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.042) 0:00:39.901 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:39.936 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.035) 0:00:39.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:40.007 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.035) 0:00:40.042 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:40.077 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.047) 0:00:40.124 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.038) 0:00:40.163 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:40.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.032) 0:00:40.230 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.035) 0:00:40.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:36:10 +0000 (0:00:00.034) 0:00:40.299 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.382) 0:00:40.681 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.401) 0:00:41.083 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.047) 0:00:41.130 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.038) 0:00:41.169 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.036) 0:00:41.206 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.036) 0:00:41.243 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.034) 0:00:41.278 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.034) 0:00:41.312 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:36:11 +0000 (0:00:00.037) 0:00:41.349 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.100) 0:00:41.450 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.040) 0:00:41.491 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.044) 0:00:41.536 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.048230", "end": "2022-06-01 13:36:11.844786", "rc": 0, "start": "2022-06-01 13:36:11.796556" }
STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.453) 0:00:41.989 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.050) 0:00:42.040 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.045) 0:00:42.086 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.037) 0:00:42.123 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.037) 0:00:42.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.035) 0:00:42.196 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.034) 0:00:42.230 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.037) 0:00:42.268 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:36:12 +0000 (0:00:00.042) 0:00:42.310 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.146) 0:00:42.456 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.040) 0:00:42.497 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device]
******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.046) 0:00:42.544 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.043) 0:00:42.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.044) 0:00:42.632 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.044) 0:00:42.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.033) 0:00:42.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.034) 0:00:42.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:36:13 +0000 
(0:00:00.039) 0:00:42.784 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.036) 0:00:42.820 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.051) 0:00:42.872 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.036) 0:00:42.908 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.040) 0:00:42.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.034) 0:00:42.983 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.035) 0:00:43.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.042) 0:00:43.062 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:36:13 +0000 (0:00:00.041) 0:00:43.103 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104952.4691215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104952.4691215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22827, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104952.4691215, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, 
"version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.445) 0:00:43.548 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.041) 0:00:43.590 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.045) 0:00:43.635 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.035) 0:00:43.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.033) 0:00:43.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.038) 0:00:43.743 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.032) 0:00:43.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.036) 0:00:43.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.033) 0:00:43.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.042) 0:00:43.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.036) 0:00:43.924 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.035) 0:00:43.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.035) 0:00:43.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.091) 0:00:44.087 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.035) 0:00:44.122 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.039) 0:00:44.162 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.037) 0:00:44.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.035) 0:00:44.235 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.037) 0:00:44.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.033) 0:00:44.306 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.033) 0:00:44.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.033) 0:00:44.373 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:36:14 +0000 (0:00:00.034) 0:00:44.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.034) 0:00:44.442 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.036) 0:00:44.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.045) 0:00:44.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.039) 0:00:44.563 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.037) 0:00:44.601 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.417) 0:00:45.018 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:36:15 +0000 (0:00:00.400) 0:00:45.419 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": 
"3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.042) 0:00:45.461 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.038) 0:00:45.500 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.040) 0:00:45.540 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.037) 0:00:45.578 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.036) 0:00:45.615 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.035) 0:00:45.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.034) 0:00:45.685 ******** ok: [/cache/rhel-x.qcow2] => { 
"storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.044) 0:00:45.729 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.041) 0:00:45.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.044) 0:00:45.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.043406", "end": "2022-06-01 13:36:16.155459", "rc": 0, "start": "2022-06-01 13:36:16.112053" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.480) 0:00:46.296 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.044) 0:00:46.340 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.045) 0:00:46.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:36:16 +0000 (0:00:00.037) 0:00:46.424 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.037) 0:00:46.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.036) 0:00:46.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.035) 0:00:46.533 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.036) 0:00:46.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly 
managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.034) 0:00:46.604 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.029) 0:00:46.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation minus the pool raid options] ************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:58 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.031) 0:00:46.665 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.074) 0:00:46.740 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.050) 0:00:46.790 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.529) 0:00:47.320 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: 
[/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:36:17 +0000 (0:00:00.088) 0:00:47.408 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.038) 0:00:47.447 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.042) 0:00:47.489 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.068) 0:00:47.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.031) 0:00:47.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.034) 0:00:47.623 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.043) 0:00:47.666 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.038) 0:00:47.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.036) 0:00:47.741 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.033) 0:00:47.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.035) 0:00:47.810 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.034) 0:00:47.845 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.048) 0:00:47.894 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:36:18 +0000 (0:00:00.032) 0:00:47.926 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" 
}, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", 
"sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:36:20 +0000 (0:00:02.315) 0:00:50.242 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask 
the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:36:20 +0000 (0:00:00.034) 0:00:50.276 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:36:20 +0000 (0:00:00.031) 0:00:50.308 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/mapper/vg1-lv1", "/dev/mapper/vg1-lv2", "/dev/mapper/vg1-lv3", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "lvm2", "mdadm" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:36:20 +0000 (0:00:00.047) 0:00:50.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:36:20 +0000 (0:00:00.043) 0:00:50.399 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:36:20 +0000 (0:00:00.037) 0:00:50.437 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:36:21 +0000 (0:00:00.032) 0:00:50.469 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:36:21 +0000 (0:00:00.727) 0:00:51.197 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test2', 
u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test3', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "mounted" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:36:22 +0000 (0:00:01.175) 0:00:52.372 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:36:23 +0000 (0:00:00.727) 0:00:53.100 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, 
"isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:36:24 +0000 (0:00:00.402) 0:00:53.503 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:36:24 +0000 (0:00:00.032) 0:00:53.535 ******** ok: [/cache/rhel-x.qcow2] TASK [Assert to preserve RAID settings for preexisting pool] ******************* task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:78 Wednesday 01 June 2022 17:36:25 +0000 (0:00:01.176) 0:00:54.712 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:87 Wednesday 01 June 2022 17:36:25 +0000 (0:00:00.047) 0:00:54.760 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:36:25 +0000 (0:00:00.062) 0:00:54.822 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:36:25 +0000 (0:00:00.052) 0:00:54.875 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:36:25 +0000 (0:00:00.035) 0:00:54.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "2G", "type": "lvm", "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "3G", "type": "lvm", "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" }, "/dev/mapper/vg1-lv3": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv3", "size": "3G", "type": "lvm", "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" }, "/dev/md/vg1-1": { "fstype": "LVM2_member", "label": "", "name": "/dev/md/vg1-1", "size": "10G", "type": "raid1", "uuid": "5Lavqr-Aww3-dp69-YLXC-BBpM-7QWJ-8fljJW" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "vg1-1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "a3a06af4-9ec8-7134-5084-50887e608829" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": 
"", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:36:25 +0000 (0:00:00.456) 0:00:55.367 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002859", "end": "2022-06-01 13:36:25.621913", "rc": 0, "start": "2022-06-01 13:36:25.619054" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0
/dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0
/dev/mapper/vg1-lv3 /opt/test3 xfs defaults 0 0

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.398) 0:00:55.766 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.004056", "end": "2022-06-01 13:36:26.032001",
"failed_when_result": false, "rc": 0, "start": "2022-06-01 13:36:26.027945" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.413) 0:00:56.179 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.084) 0:00:56.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.039) 0:00:56.304 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.075) 0:00:56.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/md127" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:36:26 +0000 (0:00:00.046) 0:00:56.426 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/md/vg1-1", "pv": "/dev/md127" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.407) 0:00:56.833 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md127) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/md/vg1-1" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/md127" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.047) 0:00:56.881 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.045) 0:00:56.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.081) 0:00:57.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.041) 0:00:57.050 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 
01 June 2022 17:36:27 +0000 (0:00:00.040) 0:00:57.090 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/md/vg1-1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.049) 0:00:57.140 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:36:27 +0000 (0:00:00.063) 0:00:57.203 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/vg1-1" ], "delta": "0:00:00.008231", "end": "2022-06-01 13:36:27.501920", "rc": 0, "start": "2022-06-01 13:36:27.493689" }

STDOUT:

/dev/md/vg1-1:
           Version : 1.0
     Creation Time : Wed Jun  1 13:35:46 2022
        Raid Level : raid1
        Array Size : 10484608 (10.00 GiB 10.74 GB)
     Used Dev Size : 10484608 (10.00 GiB 10.74 GB)
      Raid Devices : 2
     Total Devices : 3
       Persistence : Superblock is persistent

     Intent Bitmap : Internal

       Update Time : Wed Jun  1 13:36:27 2022
             State : clean, resyncing
    Active Devices : 2
   Working Devices : 3
    Failed Devices : 0
     Spare Devices : 1

Consistency Policy : bitmap

     Resync Status : 78% complete

              Name : vg1-1
              UUID : a3a06af4:9ec87134:50845088:7e608829
            Events : 24

    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1
       2       8       33        -      spare   /dev/sdc1

TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.443) 0:00:57.647 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact]
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.043) 0:00:57.690 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.043) 0:00:57.734 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.044) 0:00:57.778 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.043) 0:00:57.821 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.041) 0:00:57.863 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.043) 0:00:57.906 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
**********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.041) 0:00:57.948 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.074) 0:00:58.023 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.094) 0:00:58.117 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.034) 0:00:58.152 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.035) 0:00:58.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.036) 0:00:58.224 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.033) 0:00:58.258 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.034) 0:00:58.292 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.037) 0:00:58.330 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.035) 0:00:58.366 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.034) 0:00:58.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:36:28 +0000 (0:00:00.035) 0:00:58.436 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.068) 0:00:58.505 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.038) 0:00:58.543 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/md/vg1-1) => { "_storage_test_pool_member_path": "/dev/md/vg1-1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.036) 0:00:58.580 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.062) 0:00:58.642 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.040) 0:00:58.683 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.042) 0:00:58.725 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.034) 0:00:58.759 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.032) 0:00:58.791 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.031) 0:00:58.823 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.033) 0:00:58.856 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.034) 0:00:58.891 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:36:29 +0000
(0:00:00.071) 0:00:58.962 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.104) 0:00:59.066 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.034) 0:00:59.101 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.033) 0:00:59.134 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.076) 0:00:59.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.034) 0:00:59.245 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.126) 0:00:59.371 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:29 +0000 (0:00:00.037) 0:00:59.409 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.037) 0:00:59.447 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.038) 0:00:59.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.037) 0:00:59.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.036) 0:00:59.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.038) 0:00:59.597 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.034) 0:00:59.631 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.033) 0:00:59.664 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.033) 0:00:59.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.035) 0:00:59.733 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.035) 0:00:59.769 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.038) 0:00:59.808 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.035) 0:00:59.843 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.035) 0:00:59.878 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.032) 0:00:59.911 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.032) 0:00:59.943 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.034) 0:00:59.977 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.038) 0:01:00.015 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.035) 0:01:00.051 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.034) 0:01:00.086 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.089) 0:01:00.175 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.041) 0:01:00.216 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included:
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.132) 0:01:00.349 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.040) 0:01:00.390 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 509827,
                "block_size": 4096,
                "block_total": 521728,
                "block_used": 11901,
                "device": "/dev/mapper/vg1-lv1",
                "fstype": "xfs",
                "inode_available": 1048573,
                "inode_total": 1048576,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 2088251392,
                "size_total": 2136997888,
                "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 509827,
                "block_size": 4096,
                "block_total": 521728,
                "block_used": 11901,
                "device": "/dev/mapper/vg1-lv1",
                "fstype": "xfs",
                "inode_available": 1048573,
                "inode_total": 1048576,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 2088251392,
                "size_total": 2136997888,
                "uuid": "1f3ece0b-87c3-4f8d-b86c-42822c2c81e8"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:36:30 +0000 (0:00:00.048) 0:01:00.438 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.045) 0:01:00.484 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.040) 0:01:00.524 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.043) 0:01:00.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.034) 0:01:00.602 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.034) 0:01:00.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.036) 0:01:00.673 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.033) 0:01:00.706 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.051) 0:01:00.758 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.038) 0:01:00.796 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.049) 0:01:00.846 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.037) 0:01:00.883 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.033) 0:01:00.917 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.042) 0:01:00.960 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.039) 0:01:01.000 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654104953.0141215,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654104953.0141215,
        "dev": 5,
        "device_type": 64770,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 22896,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654104953.0141215,
        "nlink": 1,
        "path": "/dev/mapper/vg1-lv1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:36:31 +0000 (0:00:00.433) 0:01:01.434 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.047) 0:01:01.481 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.040) 0:01:01.521 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.040) 0:01:01.562 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.032) 0:01:01.595 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.038) 0:01:01.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.035) 0:01:01.668 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.088) 0:01:01.757 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.032) 0:01:01.789 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.041) 0:01:01.831 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.032) 0:01:01.863 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.034) 0:01:01.898 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.037) 0:01:01.935 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.035) 0:01:01.971 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.042) 0:01:02.014 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.044) 0:01:02.058 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.041) 0:01:02.100 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.035) 0:01:02.135 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.037) 0:01:02.173 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.035) 0:01:02.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.035) 0:01:02.244 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.039) 0:01:02.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.036) 0:01:02.319 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.039) 0:01:02.359 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.039) 0:01:02.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:36:32 +0000 (0:00:00.034) 0:01:02.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.034) 0:01:02.467 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.036) 0:01:02.503 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.414) 0:01:02.918 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 2147483648, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.418) 0:01:03.337 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "2147483648" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.042) 0:01:03.379 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:36:33 +0000 (0:00:00.040) 0:01:03.420 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.036) 0:01:03.456 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.034) 0:01:03.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.034) 0:01:03.526 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.034) 0:01:03.560 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result
was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.034) 0:01:03.595 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 2147483648, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.042) 0:01:03.638 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "2147483648" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.047) 0:01:03.686 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.047) 0:01:03.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.040134", "end": "2022-06-01 13:36:34.018004", "rc": 0, "start": "2022-06-01 13:36:33.977870" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.424) 0:01:04.159 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.043) 0:01:04.203 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.048) 0:01:04.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.039) 0:01:04.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.038) 0:01:04.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.038) 0:01:04.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:36:34 +0000 (0:00:00.037) 0:01:04.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.035) 0:01:04.442 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.043) 0:01:04.485 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.183) 0:01:04.669 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.040) 0:01:04.710 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": 
"/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "28ffc8ae-c476-4762-a869-e9b3f191adfb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.048) 0:01:04.758 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.041) 0:01:04.800 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.042) 0:01:04.842 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.043) 0:01:04.886 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.040) 0:01:04.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.035) 0:01:04.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.051) 0:01:05.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.035) 0:01:05.049 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] 
***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.060) 0:01:05.109 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.041) 0:01:05.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.043) 0:01:05.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.035) 0:01:05.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.036) 0:01:05.266 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.044) 0:01:05.311 ******** ok: [/cache/rhel-x.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:36:35 +0000 (0:00:00.046) 0:01:05.357 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104952.7281215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104952.7281215, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22862, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104952.7281215, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.425) 0:01:05.783 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.042) 0:01:05.825 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.045) 0:01:05.871 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process 
volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.043) 0:01:05.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.035) 0:01:05.950 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.042) 0:01:05.992 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.036) 0:01:06.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.035) 0:01:06.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.036) 0:01:06.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.040) 0:01:06.141 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.032) 0:01:06.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.033) 0:01:06.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.035) 0:01:06.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.039) 0:01:06.283 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.033) 0:01:06.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.043) 0:01:06.360 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.039) 0:01:06.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:36:36 +0000 (0:00:00.033) 0:01:06.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.035) 0:01:06.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.035) 0:01:06.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.035) 0:01:06.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.034) 0:01:06.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.034) 0:01:06.609 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.035) 0:01:06.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.035) 0:01:06.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.037) 0:01:06.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.036) 0:01:06.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.033) 0:01:06.786 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:36:37 +0000 (0:00:00.403) 0:01:07.190 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.430) 0:01:07.620 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.049) 0:01:07.670 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.036) 0:01:07.706 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.031) 0:01:07.738 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.033) 0:01:07.771 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.031) 0:01:07.803 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.031) 0:01:07.835 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.031) 0:01:07.867 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.035) 0:01:07.902 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.033) 0:01:07.936 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:36:38 +0000 (0:00:00.043) 0:01:07.980 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", 
"--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.036329", "end": "2022-06-01 13:36:38.305571", "rc": 0, "start": "2022-06-01 13:36:38.269242" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.468) 0:01:08.449 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.044) 0:01:08.494 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.043) 0:01:08.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.036) 0:01:08.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.039) 0:01:08.614 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.036) 0:01:08.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.035) 0:01:08.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.035) 0:01:08.721 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.039) 0:01:08.761 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.132) 0:01:08.894 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.042) 0:01:08.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770146, "block_size": 4096, "block_total": 783872, "block_used": 13726, "device": "/dev/mapper/vg1-lv3", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test3", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 3154518016, "size_total": 3210739712, "uuid": "b385a1f8-7442-43d6-b0b4-4af3c4516b9b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.049) 0:01:08.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.041) 0:01:09.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.040) 0:01:09.068 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.042) 0:01:09.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.035) 0:01:09.147 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.033) 0:01:09.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.033) 0:01:09.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } 
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.034) 0:01:09.248 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test3 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test3 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.049) 0:01:09.298 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.041) 0:01:09.339 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.040) 0:01:09.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:36:39 +0000 (0:00:00.034) 0:01:09.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": 
null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.036) 0:01:09.452 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.042) 0:01:09.495 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.046) 0:01:09.541 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654104952.4691215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654104952.4691215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 22827, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654104952.4691215, "nlink": 1, "path": "/dev/mapper/vg1-lv3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.412) 0:01:09.953 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.041) 0:01:09.995 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.042) 0:01:10.037 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.037) 0:01:10.075 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.035) 0:01:10.110 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.039) 0:01:10.150 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.075) 0:01:10.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the 
presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.033) 0:01:10.259 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.033) 0:01:10.293 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.041) 0:01:10.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.034) 0:01:10.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:36:40 +0000 (0:00:00.037) 0:01:10.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.035) 0:01:10.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.035) 0:01:10.477 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.036) 0:01:10.513 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.042) 0:01:10.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.038) 0:01:10.595 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.037) 0:01:10.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.034) 0:01:10.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.032) 0:01:10.699 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.033) 0:01:10.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.033) 0:01:10.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.036) 0:01:10.803 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.038) 0:01:10.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.034) 0:01:10.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.032) 0:01:10.908 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.033) 0:01:10.941 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.035) 0:01:10.976 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:36:41 +0000 (0:00:00.403) 0:01:11.380 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.404) 0:01:11.784 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.047) 0:01:11.831 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } 
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.045) 0:01:11.876 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.035) 0:01:11.912 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.036) 0:01:11.949 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.041) 0:01:11.991 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.037) 0:01:12.028 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.042) 0:01:12.071 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.041) 
0:01:12.113 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.038) 0:01:12.151 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:36:42 +0000 (0:00:00.045) 0:01:12.197 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv3" ], "delta": "0:00:00.040977", "end": "2022-06-01 13:36:42.500840", "rc": 0, "start": "2022-06-01 13:36:42.459863" } STDOUT: LVM2_LV_NAME=lv3 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.445) 0:01:12.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.056) 0:01:12.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.045) 0:01:12.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.036) 0:01:12.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.034) 0:01:12.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.037) 0:01:12.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.036) 0:01:12.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.037) 0:01:12.927 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.034) 0:01:12.961 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:36:43 +0000 
(0:00:00.033) 0:01:12.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the pool created above] ******************************************* task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:89 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.035) 0:01:13.030 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.126) 0:01:13.157 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:36:43 +0000 (0:00:00.068) 0:01:13.225 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.549) 0:01:13.775 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ 
"/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.080) 0:01:13.855 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.035) 0:01:13.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.033) 0:01:13.924 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.071) 0:01:13.995 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.030) 0:01:14.026 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.033) 0:01:14.059 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "2g" }, { "mount_point": "/opt/test2", "name": "lv2", "size": "3g" }, { "mount_point": "/opt/test3", "name": "lv3", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.047) 0:01:14.107 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.044) 0:01:14.152 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.035) 0:01:14.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.034) 0:01:14.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.043) 0:01:14.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.038) 0:01:14.304 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.058) 0:01:14.362 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:36:44 +0000 (0:00:00.035) 0:01:14.398 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", 
"fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:36:49 +0000 (0:00:04.421) 0:01:18.820 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:36:49 +0000 (0:00:00.035) 0:01:18.855 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:36:49 +0000 (0:00:00.032) 0:01:18.887 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", 
"fs_type": null }, { "action": "destroy format", "device": "/dev/md/vg1-1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/md/vg1-1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:36:49 +0000 (0:00:00.055) 0:01:18.943 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:36:49 +0000 (0:00:00.054) 0:01:18.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:36:49 +0000 (0:00:00.037) 0:01:19.035 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv3', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test3", "src": "/dev/mapper/vg1-lv3", "state": "absent" }, "name": "/opt/test3", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv3" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/vg1-lv1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:36:50 +0000 (0:00:01.161) 0:01:20.197 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:36:51 +0000 (0:00:00.674) 0:01:20.871 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:36:51 +0000 (0:00:00.035) 0:01:20.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:36:52 +0000 (0:00:00.681) 0:01:21.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:36:52 +0000 (0:00:00.414) 0:01:22.003 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:36:52 +0000 (0:00:00.035) 0:01:22.039 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:113 Wednesday 01 June 2022 17:36:53 +0000 (0:00:00.929) 0:01:22.968 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:36:53 +0000 (0:00:00.060) 0:01:23.029 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "2g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/vg1-lv3", "_mount_id": "/dev/mapper/vg1-lv3", "_raw_device": "/dev/mapper/vg1-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdb", "sda", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test3", "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:36:53 +0000 (0:00:00.044) 0:01:23.073 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:36:53 +0000 (0:00:00.027) 0:01:23.101 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": 
"/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:36:54 +0000 (0:00:00.411) 0:01:23.513 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002974", "end": "2022-06-01 13:36:53.768961", "rc": 0, "start": "2022-06-01 13:36:53.765987" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:36:54 +0000 (0:00:00.393) 0:01:23.906 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003143", "end": "2022-06-01 13:36:54.173826", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:36:54.170683" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:36:54 +0000 (0:00:00.404) 0:01:24.311 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:36:54 +0000 (0:00:00.077) 0:01:24.388 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:36:54 +0000 (0:00:00.032) 0:01:24.421 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.066) 0:01:24.487 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.042) 0:01:24.530 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.029) 0:01:24.559 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.029) 0:01:24.588 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.042) 0:01:24.630 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.039) 0:01:24.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.041) 0:01:24.711 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "raid1" }, "changed": false } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.039) 0:01:24.751 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.027) 0:01:24.778 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.054) 0:01:24.833 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:36:55 +0000 
(0:00:00.032) 0:01:24.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.032) 0:01:24.898 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.085) 0:01:24.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.033) 0:01:25.017 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.033) 0:01:25.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.035) 0:01:25.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.035) 0:01:25.121 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": 
null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.031) 0:01:25.153 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.061) 0:01:25.214 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.085) 0:01:25.300 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.050) 0:01:25.350 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.033) 0:01:25.384 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:55 +0000 (0:00:00.032) 0:01:25.416 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:25.449 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:25.482 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.033) 0:01:25.515 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.036) 0:01:25.552 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:25.585 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.031) 0:01:25.616 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.062) 0:01:25.679 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.038) 0:01:25.717 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.030) 0:01:25.747 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.026) 0:01:25.774 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:25.806 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.063) 0:01:25.870 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.109) 0:01:25.979 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.034) 0:01:26.014 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.035) 0:01:26.050 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.082 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.030) 0:01:26.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.178 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.211 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.035) 0:01:26.246 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.278 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.311 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.343 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.031) 0:01:26.374 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.031) 0:01:26.406 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:56 +0000 (0:00:00.032) 0:01:26.438 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.033) 0:01:26.471 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.033) 0:01:26.505 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.031) 0:01:26.537 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.030) 0:01:26.568 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.030) 0:01:26.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.033) 0:01:26.632 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.029) 0:01:26.661 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.028) 0:01:26.689 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.028) 0:01:26.718 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.029) 0:01:26.747 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.030) 0:01:26.777 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.092) 0:01:26.870 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.035) 0:01:26.906 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.183) 0:01:27.089 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.038) 0:01:27.127 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.044) 0:01:27.172 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.032) 0:01:27.205 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.036) 0:01:27.242 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.032) 0:01:27.274 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.033) 0:01:27.308 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.034) 0:01:27.342 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.037) 0:01:27.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:36:57 +0000 (0:00:00.032) 0:01:27.412 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.046) 0:01:27.459 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.027) 0:01:27.486 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.039) 0:01:27.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.031) 0:01:27.557 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.036) 0:01:27.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.032) 0:01:27.626 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.026) 0:01:27.652 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.395) 0:01:28.048 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.036) 0:01:28.085 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.026) 0:01:28.112 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.037) 0:01:28.150 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.032) 0:01:28.183 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.027) 0:01:28.210 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.028) 0:01:28.239 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.028) 0:01:28.267 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.029) 0:01:28.297 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.028) 0:01:28.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.035) 0:01:28.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.030) 0:01:28.391 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:36:58 +0000 (0:00:00.031) 0:01:28.422 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.035) 0:01:28.458 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.490 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.039) 0:01:28.530 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.040) 0:01:28.570 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.603 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.668 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.034) 0:01:28.703 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.041) 0:01:28.744 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.041) 0:01:28.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.818 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.033) 0:01:28.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.033) 0:01:28.885 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:28.918 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.031) 0:01:28.950 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.036) 0:01:28.986 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.031) 0:01:29.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:29.050 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.035) 0:01:29.085 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:29.117 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.032) 0:01:29.149 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.036) 0:01:29.186 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.034) 0:01:29.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.033) 0:01:29.254 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.038) 0:01:29.292 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.035) 0:01:29.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.033) 0:01:29.362 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:36:59 +0000 (0:00:00.038) 0:01:29.400 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.099) 0:01:29.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.032) 0:01:29.532 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.030) 0:01:29.563 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.029) 0:01:29.592 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.029) 0:01:29.622 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.029) 0:01:29.652 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.033) 0:01:29.686 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.035) 0:01:29.721 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.123) 0:01:29.845 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.035) 0:01:29.881 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.040) 0:01:29.921 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.031) 0:01:29.953 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG:
All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.042) 0:01:29.995 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.032) 0:01:30.028 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.031) 0:01:30.060 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.030) 0:01:30.090 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.032) 0:01:30.123 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.033) 0:01:30.157 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.049) 0:01:30.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.027) 0:01:30.233 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.036) 
0:01:30.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.030) 0:01:30.301 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.032) 0:01:30.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.031) 0:01:30.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:37:00 +0000 (0:00:00.030) 0:01:30.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.408) 0:01:30.804 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.041) 0:01:30.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.027) 0:01:30.873 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.036) 0:01:30.909 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.030) 0:01:30.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.028) 0:01:30.968 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.034) 0:01:31.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.033) 0:01:31.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.032) 0:01:31.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.026) 0:01:31.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.031) 0:01:31.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.032) 0:01:31.159 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.035) 0:01:31.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.033) 0:01:31.228 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.032) 0:01:31.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.040) 0:01:31.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.038) 0:01:31.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.032) 0:01:31.372 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.033) 0:01:31.406 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:37:01 +0000 (0:00:00.032) 0:01:31.439 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.033) 0:01:31.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.033) 0:01:31.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:31.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.033) 0:01:31.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.035) 0:01:31.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:31.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.031) 0:01:31.671 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:31.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.031) 0:01:31.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.031) 0:01:31.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.035) 0:01:31.802 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.081) 0:01:31.883 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:31.915 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.031) 0:01:31.947 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.030) 0:01:31.978 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.033) 0:01:32.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.034) 0:01:32.045 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.040) 0:01:32.085 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK 
[assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.038) 0:01:32.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:32.156 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.030) 0:01:32.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:32.219 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:32.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.037) 0:01:32.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.044) 0:01:32.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.034) 0:01:32.368 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.032) 0:01:32.401 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:37:02 +0000 (0:00:00.030) 0:01:32.432 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.035) 0:01:32.468 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml 
for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.137) 0:01:32.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.036) 0:01:32.643 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.040) 0:01:32.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.031) 0:01:32.714 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.035) 0:01:32.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.030) 0:01:32.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.032) 0:01:32.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.029) 0:01:32.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.029) 0:01:32.872 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.031) 0:01:32.903 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.047) 0:01:32.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.027) 0:01:32.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.040) 0:01:33.019 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.032) 0:01:33.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.033) 0:01:33.085 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] 
********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.031) 0:01:33.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:37:03 +0000 (0:00:00.027) 0:01:33.145 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.392) 0:01:33.537 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.038) 0:01:33.576 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.027) 0:01:33.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.036) 0:01:33.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.032) 0:01:33.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.028) 0:01:33.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.032) 0:01:33.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.031) 0:01:33.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.029) 0:01:33.795 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.029) 0:01:33.824 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.031) 0:01:33.855 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.031) 0:01:33.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.031) 0:01:33.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.032) 0:01:33.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.032) 0:01:33.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.043) 0:01:34.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab 
entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.037) 0:01:34.065 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.035) 0:01:34.100 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.033) 0:01:34.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.032) 0:01:34.166 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.031) 0:01:34.198 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.084) 0:01:34.282 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.034) 0:01:34.316 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.034) 0:01:34.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.035) 0:01:34.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:37:04 +0000 (0:00:00.033) 0:01:34.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.031) 0:01:34.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.035) 0:01:34.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.032) 0:01:34.518 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:34.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.043) 0:01:34.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.034) 0:01:34.631 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.039) 0:01:34.670 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.041) 0:01:34.711 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.037) 0:01:34.748 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.034) 0:01:34.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.034) 0:01:34.817 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.036) 0:01:34.853 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.038) 0:01:34.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.038) 0:01:34.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:34.964 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] 
****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:34.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.032) 0:01:35.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.031) 0:01:35.061 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.032) 0:01:35.094 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.036) 0:01:35.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:35.164 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task 
path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:35.197 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.033) 0:01:35.230 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.031) 0:01:35.262 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=544 changed=4 unreachable=0 failed=0 skipped=495 rescued=0 ignored=0 Wednesday 01 June 2022 17:37:05 +0000 (0:00:00.017) 0:01:35.279 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 9.46s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 4.42s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.32s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : set up new/current mounts ------------------ 1.41s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : Update facts ------------------------------- 1.18s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : set up new/current mounts ------------------ 1.18s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : remove obsolete mounts --------------------- 1.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 linux-system-roles.storage : make sure blivet is available -------------- 1.09s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.09s /tmp/tmp7247_7fr/tests/tests_raid_pool_options_scsi_generated.yml:3 ----------- linux-system-roles.storage : Update facts ------------------------------- 1.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.00s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.86s /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml:2 -------------------------- linux-system-roles.storage : make sure required packages are installed --- 0.85s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.73s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.73s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. 
Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:37:06 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:37:08 +0000 (0:00:01.392) 0:00:01.415 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.39s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_raid_volume_options.yml **************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:2 Wednesday 01 June 2022 17:37:08 +0000 (0:00:00.013) 0:00:01.429 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:10 Wednesday 01 June 2022 17:37:09 +0000 (0:00:01.104) 0:00:02.533 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.039) 0:00:02.572 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.158) 0:00:02.731 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.535) 0:00:03.267 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.076) 0:00:03.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.022) 0:00:03.366 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:37:09 +0000 (0:00:00.023) 0:00:03.389 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:37:10 +0000 (0:00:00.195) 0:00:03.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:37:10 +0000 (0:00:00.020) 0:00:03.605 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:37:11 +0000 (0:00:01.083) 0:00:04.688 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:37:11 +0000 (0:00:00.048) 0:00:04.736 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:37:11 +0000 (0:00:00.055) 0:00:04.792 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:37:12 +0000 (0:00:00.736) 0:00:05.529 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:37:12 +0000 (0:00:00.081) 0:00:05.610 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:37:12 +0000 (0:00:00.020) 0:00:05.631 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:37:12 +0000 (0:00:00.023) 0:00:05.654 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:37:12 +0000 (0:00:00.021) 0:00:05.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:37:13 +0000 (0:00:00.851) 0:00:06.527 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": 
"nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", 
"source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:37:16 +0000 (0:00:02.897) 0:00:09.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.057) 0:00:09.483 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.041) 0:00:09.524 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.564) 0:00:10.089 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.066) 0:00:10.156 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.028) 0:00:10.184 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.034) 0:00:10.219 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list 
of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.034) 0:00:10.254 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.033) 0:00:10.287 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.027) 0:00:10.315 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.032) 0:00:10.347 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:37:16 +0000 (0:00:00.029) 0:00:10.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:37:17 +0000 (0:00:00.029) 0:00:10.407 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", 
"ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:37:17 +0000 (0:00:00.487) 0:00:10.894 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:37:17 +0000 (0:00:00.029) 0:00:10.923 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:13 Wednesday 01 June 2022 17:37:18 +0000 (0:00:00.880) 0:00:11.803 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:20 Wednesday 01 June 2022 17:37:18 +0000 (0:00:00.031) 0:00:11.834 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 
01 June 2022 17:37:18 +0000 (0:00:00.045) 0:00:11.880 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.549) 0:00:12.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.039) 0:00:12.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.030) 0:00:12.500 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb", "sdc" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:25 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.034) 0:00:12.534 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.054) 0:00:12.588 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.043) 0:00:12.632 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.536) 0:00:13.169 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.070) 0:00:13.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.031) 0:00:13.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:37:19 +0000 (0:00:00.032) 0:00:13.304 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.060) 0:00:13.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:37:19 +0000 (0:00:00.026) 0:00:13.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.032) 0:00:13.423 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.069) 0:00:13.492 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb", "sdc" ], "mount_point": "/opt/test1", "name": "test1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.037) 0:00:13.530 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.030) 0:00:13.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.029) 0:00:13.591 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.030) 0:00:13.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.029) 0:00:13.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.046) 0:00:13.697 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:37:20 +0000 (0:00:00.028) 
0:00:13.726 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:37:28 +0000 (0:00:08.264) 0:00:21.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.034) 0:00:22.025 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.031) 0:00:22.057 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" 
} ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.042) 0:00:22.099 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.036) 0:00:22.135 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.038) 0:00:22.173 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:37:28 +0000 (0:00:00.030) 0:00:22.204 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:37:29 +0000 (0:00:01.056) 0:00:23.261 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb', u'state': u'mounted', u'dump': 0, u'path': 
u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:37:30 +0000 (0:00:00.600) 0:00:23.861 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:37:31 +0000 (0:00:00.694) 0:00:24.556 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we 
just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:37:31 +0000 (0:00:00.404) 0:00:24.961 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:37:31 +0000 (0:00:00.029) 0:00:24.991 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:40 Wednesday 01 June 2022 17:37:32 +0000 (0:00:00.913) 0:00:25.904 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:37:32 +0000 (0:00:00.061) 0:00:25.966 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:37:32 +0000 (0:00:00.031) 0:00:25.997 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:37:32 +0000 (0:00:00.040) 0:00:26.038 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { 
"fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:37:33 +0000 (0:00:00.569) 0:00:26.607 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003238", "end": "2022-06-01 13:37:33.037609", "rc": 0, "start": "2022-06-01 13:37:33.034371" } STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:37:33 +0000 (0:00:00.525) 0:00:27.133 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003562", "end": "2022-06-01 13:37:33.445074", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:37:33.441512" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.402) 0:00:27.535 ******** TASK [Clean up variable
namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.029) 0:00:27.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.033) 0:00:27.598 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.064) 0:00:27.663 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.036) 0:00:27.699 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.116) 0:00:27.815 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.036) 0:00:27.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.044) 0:00:27.897 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 
2022 17:37:34 +0000 (0:00:00.040) 0:00:27.937 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.037) 0:00:27.975 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.040) 0:00:28.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.032) 0:00:28.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.039) 0:00:28.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.047) 0:00:28.135 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.049) 0:00:28.184 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.050) 0:00:28.235 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.037) 0:00:28.272 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.038) 0:00:28.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.033) 0:00:28.343 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, 
"storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:37:34 +0000 (0:00:00.034) 0:00:28.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.043) 0:00:28.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.037) 0:00:28.459 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105047.8101215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105047.8101215, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 23389, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105047.8101215, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.464) 0:00:28.923 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.038) 0:00:28.962 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.036) 0:00:28.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.033) 0:00:29.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid1" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.037) 0:00:29.070 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.036) 0:00:29.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.032) 0:00:29.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of 
the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.032) 0:00:29.171 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.030) 0:00:29.201 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.037) 0:00:29.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.033) 0:00:29.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.031) 0:00:29.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.032) 0:00:29.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:37:35 +0000 (0:00:00.033) 0:00:29.370 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.032) 0:00:29.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.039) 0:00:29.443 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.038) 0:00:29.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.031) 0:00:29.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.030) 0:00:29.544 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.030) 0:00:29.575 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.032) 0:00:29.607 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.008351", "end": "2022-06-01 13:37:35.923151", "rc": 0, "start": "2022-06-01 13:37:35.914800" } STDOUT:

/dev/md/test1:
           Version : 1.0
     Creation Time : Wed Jun 1 13:37:22 2022
        Raid Level : raid1
        Array Size : 10484608 (10.00 GiB 10.74 GB)
     Used Dev Size : 10484608 (10.00 GiB 10.74 GB)
      Raid Devices : 2
     Total Devices : 3
       Persistence : Superblock is persistent

     Intent Bitmap : Internal

       Update Time : Wed Jun 1 13:37:35 2022
             State : clean, resyncing
    Active Devices : 2
   Working Devices : 3
    Failed Devices : 0
     Spare Devices : 1

Consistency Policy : bitmap

     Resync Status : 26% complete

              Name : test1
              UUID : e8db2aec:35f25f5f:043175f9:4f7d1a0c
            Events : 9

    Number   Major   Minor   RaidDevice State
       0       8        1        0      active sync   /dev/sda1
       1       8       17        1      active sync   /dev/sdb1

       2       8       33        -      spare         /dev/sdc1

TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.408) 0:00:30.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.048) 0:00:30.064 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.103) 0:00:30.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.040) 0:00:30.208 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.040) 0:00:30.249 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.043) 0:00:30.293 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:37:36 +0000 (0:00:00.043) 0:00:30.336 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.483) 0:00:30.820 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.034) 0:00:30.854 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.033) 0:00:30.888 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.037) 0:00:30.925 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.032) 0:00:30.957 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.033) 0:00:30.991 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.037) 0:00:31.028 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.035) 0:00:31.063 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.033) 0:00:31.097 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.034) 0:00:31.134 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.032) 0:00:31.169 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.036) 0:00:31.201 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.036) 0:00:31.238 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.035) 0:00:31.273 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.035) 0:00:31.309 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.035) 0:00:31.344 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:37:37 +0000 (0:00:00.032) 0:00:31.377 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.034) 0:00:31.412 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.035) 0:00:31.447 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.033) 0:00:31.481 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Re-run the same invocation without the RAID params] **********************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:42
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.033) 0:00:31.514 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.066) 0:00:31.581 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.050) 0:00:31.631 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.543) 0:00:32.174 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.073) 0:00:32.248 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.036) 0:00:32.284 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.032) 0:00:32.316 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:37:38 +0000 (0:00:00.063) 0:00:32.380 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.028) 0:00:32.408 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.035) 0:00:32.444 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.038) 0:00:32.482 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb", "sdc" ], "mount_point": "/opt/test1", "name": "test1", "state": "present", "type": "raid" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.038) 0:00:32.521 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.032) 0:00:32.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.031) 0:00:32.586 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.032) 0:00:32.618 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.037) 0:00:32.656 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.048) 0:00:32.704 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:37:39 +0000 (0:00:00.031) 0:00:32.736 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "present", "type": "raid", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:37:40 +0000 (0:00:01.593) 0:00:34.329 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:37:40 +0000 (0:00:00.032) 0:00:34.361 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:37:40 +0000 (0:00:00.028) 0:00:34.389 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "present", "type": "raid", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:37:41 +0000 (0:00:00.039) 0:00:34.429 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:37:41 +0000 (0:00:00.035) 0:00:34.465 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:37:41 +0000 (0:00:00.037) 0:00:34.502 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:37:41 +0000 (0:00:00.033) 0:00:34.536 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:37:41 +0000 (0:00:00.698) 0:00:35.234 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:37:42 +0000 (0:00:00.420) 0:00:35.655 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:37:42 +0000 (0:00:00.714) 0:00:36.370 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:37:43 +0000 (0:00:00.400) 0:00:36.771 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:37:43 +0000 (0:00:00.030) 0:00:36.801 ********
ok: [/cache/rhel-x.qcow2]

TASK [Assert to preserve RAID settings for preexisting volume] *****************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:53
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.915) 0:00:37.716 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:62
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.040) 0:00:37.756 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.069) 0:00:37.825 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.035) 0:00:37.860 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "present", "type": "raid", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.042) 0:00:37.902 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "e8db2aec-35f2-5f5f-0431-75f94f7d1a0c" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:37:44 +0000 (0:00:00.418) 0:00:38.321 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003450", "end": "2022-06-01 13:37:44.636919", "rc": 0, "start": "2022-06-01 13:37:44.633469" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.438) 0:00:38.760 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003817", "end": "2022-06-01 13:37:45.095796", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:37:45.091979" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.428) 0:00:39.188 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.031) 0:00:39.220 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.033) 0:00:39.254 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.065) 0:00:39.320 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:37:45 +0000 (0:00:00.037) 0:00:39.357 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.113) 0:00:39.471 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.036) 0:00:39.508 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.043) 0:00:39.552 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.040) 0:00:39.593 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.037) 0:00:39.630 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.041) 0:00:39.672 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:39.704 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:39.737 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:39.770 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:39.803 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.054) 0:00:39.858 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.038) 0:00:39.896 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.039) 0:00:39.935 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:39.968 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.032) 0:00:40.001 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.040) 0:00:40.041 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:37:46 +0000 (0:00:00.041) 0:00:40.082 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105047.8101215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105047.8101215, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 23389, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105047.8101215, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022
17:37:47 +0000 (0:00:00.406) 0:00:40.489 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.039) 0:00:40.528 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.035) 0:00:40.564 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.036) 0:00:40.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid1" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.038) 0:00:40.639 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.037) 0:00:40.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.034) 0:00:40.710 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.033) 0:00:40.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.033) 0:00:40.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.039) 0:00:40.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.034) 0:00:40.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.033) 0:00:40.885 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.033) 0:00:40.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.031) 0:00:40.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.037) 0:00:40.987 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.044) 0:00:41.032 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.038) 0:00:41.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.033) 0:00:41.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.036) 0:00:41.140 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.034) 0:00:41.175 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:37:47 +0000 (0:00:00.032) 0:00:41.207 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.008505", "end": "2022-06-01 13:37:47.531213", "rc": 0, "start": "2022-06-01 13:37:47.522708" } STDOUT: /dev/md/test1: Version : 1.0 Creation Time : Wed Jun 1 13:37:22 2022 Raid Level : raid1 Array Size : 10484608 (10.00 GiB 10.74 GB) Used Dev Size : 10484608 (10.00 GiB 10.74 GB) Raid Devices : 2 Total Devices : 3 Persistence : Superblock is persistent Intent Bitmap : Internal Update Time : Wed Jun 1 13:37:45 2022 State : clean, resyncing Active Devices : 2 Working Devices : 3 Failed Devices : 0 Spare Devices : 1 Consistency Policy : bitmap Resync Status : 47% complete Name : test1 UUID : e8db2aec:35f25f5f:043175f9:4f7d1a0c Events : 13 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 2 8 33 - spare /dev/sdc1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.419) 0:00:41.627 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.042) 0:00:41.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.038) 0:00:41.708 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.041) 0:00:41.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.087) 0:00:41.837 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.045) 0:00:41.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.042) 0:00:41.925 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 
Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.409) 0:00:42.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:37:48 +0000 (0:00:00.037) 0:00:42.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.034) 0:00:42.406 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.036) 0:00:42.443 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:42.475 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:42.508 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.033) 0:00:42.541 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and 
percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.033) 0:00:42.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.055) 0:00:42.629 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.060) 0:00:42.689 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.036) 0:00:42.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:42.758 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:42.791 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment 
type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.031) 0:00:42.822 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:42.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.039) 0:00:42.894 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.037) 0:00:42.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.036) 0:00:42.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:43.002 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* 
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.032) 0:00:43.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Remove the disk device created above] ************************************ task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:64 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.033) 0:00:43.067 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.083) 0:00:43.151 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:37:49 +0000 (0:00:00.052) 0:00:43.203 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.736) 0:00:43.940 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] 
}, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.076) 0:00:44.016 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.035) 0:00:44.051 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.036) 0:00:44.088 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.064) 0:00:44.153 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.027) 
0:00:44.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.031) 0:00:44.212 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.035) 0:00:44.247 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb", "sdc" ], "mount_point": "/opt/test1", "name": "test1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "absent", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.037) 0:00:44.285 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.032) 0:00:44.318 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.034) 0:00:44.352 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:37:50 +0000 (0:00:00.030) 0:00:44.383 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:37:51 +0000 (0:00:00.035) 0:00:44.418 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:37:51 +0000 (0:00:00.049) 0:00:44.468 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:37:51 +0000 (0:00:00.030) 0:00:44.499 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { 
"action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:37:53 +0000 (0:00:02.767) 0:00:47.267 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:37:53 +0000 (0:00:00.032) 0:00:47.299 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:37:53 +0000 (0:00:00.031) 0:00:47.331 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/md/test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/md/test1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:37:53 +0000 (0:00:00.045) 0:00:47.376 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:37:54 +0000 (0:00:00.037) 0:00:47.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "absent", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 
Wednesday 01 June 2022 17:37:54 +0000 (0:00:00.039) 0:00:47.454 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=05c637b5-1284-4bb1-9b86-ff10a90ea4cb" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:37:54 +0000 (0:00:00.415) 0:00:47.869 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:37:55 +0000 (0:00:00.700) 0:00:48.569 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:37:55 +0000 (0:00:00.033) 0:00:48.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:37:55 +0000 (0:00:00.679) 0:00:49.281 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, 
"dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:37:56 +0000 (0:00:00.405) 0:00:49.687 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:37:56 +0000 (0:00:00.032) 0:00:49.719 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:79 Wednesday 01 June 2022 17:37:57 +0000 (0:00:00.870) 0:00:50.589 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:37:57 +0000 (0:00:00.067) 0:00:50.657 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:37:57 +0000 (0:00:00.034) 0:00:50.691 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "", 
"_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": "0 B", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 10736238592, "state": "absent", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:37:57 +0000 (0:00:00.040) 0:00:50.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:37:57 +0000 (0:00:00.393) 0:00:51.126 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002911", "end": "2022-06-01 13:37:57.428660", "rc": 0, "start": "2022-06-01 13:37:57.425749" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.396) 0:00:51.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002888", "end": "2022-06-01 13:37:57.821347", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:37:57.818459" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.396) 0:00:51.918 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.030) 
0:00:51.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.032) 0:00:51.981 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.063) 0:00:52.045 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.078) 0:00:52.123 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on 
device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.110) 0:00:52.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.036) 0:00:52.271 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.038) 0:00:52.309 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.032) 0:00:52.342 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:37:58 +0000 (0:00:00.035) 0:00:52.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.031) 0:00:52.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.031) 0:00:52.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.032) 0:00:52.474 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.028) 0:00:52.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.028) 0:00:52.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.044) 0:00:52.575 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.026) 0:00:52.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.036) 0:00:52.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.031) 0:00:52.670 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.030) 0:00:52.701 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.030) 0:00:52.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] 
********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.025) 0:00:52.757 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.386) 0:00:53.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.036) 0:00:53.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.025) 0:00:53.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.034) 0:00:53.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid1" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.036) 0:00:53.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 
June 2022 17:37:59 +0000 (0:00:00.027) 0:00:53.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.031) 0:00:53.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:37:59 +0000 (0:00:00.032) 0:00:53.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:53.401 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.028) 0:00:53.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.461 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:38:00 +0000 
(0:00:00.032) 0:00:53.493 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.557 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.588 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.041) 0:00:53.630 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.036) 0:00:53.666 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.035) 0:00:53.733 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:53.764 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.030) 0:00:53.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.034) 0:00:53.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:53.861 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.034) 0:00:53.895 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:53.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.037) 0:00:53.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.035) 0:00:54.001 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.034) 0:00:54.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.031) 0:00:54.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.030) 0:00:54.098 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:54.131 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.035) 0:00:54.166 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.033) 0:00:54.199 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.033) 0:00:54.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:54.266 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:54.298 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.032) 0:00:54.331 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:38:00 +0000 (0:00:00.037) 0:00:54.368 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.034) 0:00:54.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.080) 0:00:54.483 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.038) 0:00:54.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.032) 0:00:54.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.031) 0:00:54.586 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.031) 0:00:54.618 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.030) 0:00:54.648 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.035) 0:00:54.684 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.033) 0:00:54.717 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.034) 0:00:54.752 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=219 changed=4 unreachable=0 failed=0 skipped=158 rescued=0 ignored=0

Wednesday 01 June 2022 17:38:01 +0000 (0:00:00.018) 0:00:54.770 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.26s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 2.90s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.77s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.39s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:2 ------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.92s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.74s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:38:02 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:38:03 +0000 (0:00:01.355) 0:00:01.378 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_raid_volume_options_nvme_generated.yml *************************
2 plays in /tmp/tmp7247_7fr/tests/tests_raid_volume_options_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:38:03 +0000 (0:00:00.022) 0:00:01.400 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:38:04 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:38:05 +0000 (0:00:01.309) 0:00:01.333 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.31s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_raid_volume_options_scsi_generated.yml *************************
2 plays in /tmp/tmp7247_7fr/tests/tests_raid_volume_options_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options_scsi_generated.yml:3
Wednesday 01 June 2022 17:38:05 +0000 (0:00:00.015) 0:00:01.348 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options_scsi_generated.yml:7
Wednesday 01 June 2022 17:38:06 +0000 (0:00:01.170) 0:00:02.518 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:2
Wednesday 01 June 2022 17:38:06 +0000 (0:00:00.026) 0:00:02.545 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:10
Wednesday 01 June 2022 17:38:07 +0000 (0:00:00.862) 0:00:03.407 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:38:07 +0000 (0:00:00.038) 0:00:03.445 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:38:07 +0000 (0:00:00.163) 0:00:03.609 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.540) 0:00:04.150 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.072) 0:00:04.222 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.021) 0:00:04.244 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.020) 0:00:04.265 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.198) 0:00:04.463 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:38:08 +0000 (0:00:00.019) 0:00:04.483 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:38:09 +0000 (0:00:01.075) 0:00:05.558 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:38:09 +0000 (0:00:00.049) 0:00:05.607 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:38:09 +0000 (0:00:00.047) 0:00:05.655 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:38:10 +0000 (0:00:00.703) 0:00:06.358 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:38:10 +0000 (0:00:00.083) 0:00:06.441 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:38:10 +0000 (0:00:00.021) 0:00:06.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:38:10 +0000 (0:00:00.022) 0:00:06.485 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:38:10 +0000 (0:00:00.020) 0:00:06.506 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:38:11 +0000 (0:00:00.824) 0:00:07.331 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": 
{ "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": 
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:38:13 +0000 (0:00:01.867) 0:00:09.199 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:38:13 +0000 (0:00:00.045) 0:00:09.245 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:38:13 +0000 (0:00:00.027) 0:00:09.272 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.571) 0:00:09.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.029) 0:00:09.873 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.029) 0:00:09.903 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.036) 0:00:09.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.032) 0:00:09.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.032) 0:00:10.005 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.028) 0:00:10.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.030) 0:00:10.063 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.031) 0:00:10.095 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.029) 0:00:10.124 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.516) 0:00:10.641 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:38:14 +0000 (0:00:00.028) 0:00:10.670 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:13 Wednesday 01 June 
2022 17:38:15 +0000 (0:00:00.856) 0:00:11.526 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:20 Wednesday 01 June 2022 17:38:15 +0000 (0:00:00.030) 0:00:11.557 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:38:15 +0000 (0:00:00.076) 0:00:11.633 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.495) 0:00:12.129 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.036) 0:00:12.166 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.030) 0:00:12.196 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "sda", "sdb", "sdc" ] } TASK [Create a RAID0 device mounted on "/opt/test1"] *************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:25 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.037) 0:00:12.234 ******** TASK [linux-system-roles.storage : set 
platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.056) 0:00:12.290 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:38:16 +0000 (0:00:00.045) 0:00:12.335 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.542) 0:00:12.878 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:38:17 +0000 
(0:00:00.069) 0:00:12.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.028) 0:00:12.976 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.030) 0:00:13.006 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.061) 0:00:13.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.024) 0:00:13.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.027) 0:00:13.120 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.029) 0:00:13.149 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "sda", "sdb", "sdc" ], "mount_point": "/opt/test1", "name": "test1", "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "state": "present", "type": "raid" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.036) 0:00:13.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.029) 0:00:13.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.027) 0:00:13.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.027) 0:00:13.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.027) 0:00:13.298 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.040) 0:00:13.338 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:38:17 +0000 (0:00:00.030) 0:00:13.368 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "state": "mounted" } ], "packages": [ 
"xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:38:25 +0000 (0:00:08.248) 0:00:21.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:38:25 +0000 (0:00:00.033) 0:00:21.650 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:38:25 +0000 (0:00:00.030) 0:00:21.681 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { 
"action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "mdmember" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "mdmember" }, { "action": "create device", "device": "/dev/md/test1", "fs_type": null }, { "action": "create format", "device": "/dev/md/test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/md/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "state": "mounted" } ], "packages": [ "xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", 
"raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:38:26 +0000 (0:00:00.044) 0:00:21.725 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:38:26 +0000 (0:00:00.077) 0:00:21.802 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:38:26 +0000 (0:00:00.040) 
0:00:21.843 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:38:26 +0000 (0:00:00.031) 0:00:21.874 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:38:27 +0000 (0:00:00.966) 0:00:22.840 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=3517ecef-b308-482c-8907-7eb83fc04137', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3517ecef-b308-482c-8907-7eb83fc04137" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:38:27 +0000 (0:00:00.546) 0:00:23.386 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:38:28 +0000 (0:00:00.704) 0:00:24.091 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": 
"da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:38:28 +0000 (0:00:00.382) 0:00:24.474 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:38:28 +0000 (0:00:00.033) 0:00:24.507 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:40 Wednesday 01 June 2022 17:38:29 +0000 (0:00:00.871) 0:00:25.379 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:38:29 +0000 (0:00:00.054) 0:00:25.433 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:38:29 +0000 (0:00:00.032) 0:00:25.465 ******** ok: 
[/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/md/test1", "_kernel_device": "/dev/md127", "_mount_id": "UUID=3517ecef-b308-482c-8907-7eb83fc04137", "_raw_device": "/dev/md/test1", "_raw_kernel_device": "/dev/md127", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": 2, "raid_level": "raid1", "raid_metadata_version": "1.0", "raid_spare_count": 1, "size": 0, "state": "present", "type": "raid", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:38:29 +0000 (0:00:00.038) 0:00:25.504 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:38:30 +0000 (0:00:00.568) 0:00:26.073 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003540", "end": "2022-06-01 13:38:30.199282", "rc": 0, "start": "2022-06-01 13:38:30.195742" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3517ecef-b308-482c-8907-7eb83fc04137 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:38:30 +0000 (0:00:00.555) 0:00:26.628 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003396", "end": "2022-06-01 13:38:30.612438", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:38:30.609042" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.405) 0:00:27.033 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.030) 0:00:27.064 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.034) 0:00:27.099 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.066) 0:00:27.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.039) 0:00:27.205 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.175) 0:00:27.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/md127" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.044) 0:00:27.425 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": "/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.042) 0:00:27.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.039) 0:00:27.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.037) 
0:00:27.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.038) 0:00:27.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.031) 0:00:27.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.035) 0:00:27.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.032) 0:00:27.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:38:31 +0000 (0:00:00.032) 0:00:27.716 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=3517ecef-b308-482c-8907-7eb83fc04137 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.054) 0:00:27.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.037) 0:00:27.809 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.041) 0:00:27.850 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.034) 0:00:27.884 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:38:32 +0000 
(0:00:00.031) 0:00:27.916 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.039) 0:00:27.955 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.042) 0:00:27.998 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105105.1141214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105105.1141214, "dev": 5, "device_type": 2431, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 23860, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105105.1141214, "nlink": 1, "path": "/dev/md/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.438) 0:00:28.436 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.038) 0:00:28.475 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.037) 0:00:28.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.037) 0:00:28.550 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "raid1" }, "changed": false } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.038) 0:00:28.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.039) 0:00:28.628 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.036) 0:00:28.665 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:38:32 +0000 (0:00:00.033) 0:00:28.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the 
device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.035) 0:00:28.734 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.052) 0:00:28.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.045) 0:00:28.832 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.034) 0:00:28.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.032) 0:00:28.899 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.032) 0:00:28.932 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.042) 0:00:28.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.044) 0:00:29.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.040) 0:00:29.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.031) 0:00:29.091 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.032) 0:00:29.123 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.030) 0:00:29.154 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.033) 0:00:29.187 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "mdadm", "--detail", "/dev/md/test1" ], "delta": "0:00:00.009461", "end": "2022-06-01 13:38:33.177176", "rc": 0, "start": "2022-06-01 13:38:33.167715" } STDOUT: /dev/md/test1: Version : 1.0 Creation Time : Wed Jun 1 13:38:19 2022 Raid Level : raid1 Array Size : 10484608 (10.00 GiB 10.74 GB) Used Dev Size : 10484608 (10.00 GiB 10.74 GB) Raid Devices : 2 Total Devices : 3 Persistence : Superblock is persistent Intent Bitmap : Internal Update Time : Wed Jun 1 13:38:33 2022 State : clean, resyncing Active Devices : 2 Working Devices : 3 Failed Devices : 0 Spare Devices : 1 Consistency Policy : bitmap Resync Status : 26% complete Name : test1 UUID : cea8517f:119e6117:5a2dac1a:99857af3 Events : 9 Number Major Minor RaidDevice State 0 8 1 0 active sync /dev/sda1 1 8 17 1 active sync /dev/sdb1 2 8 33 - spare /dev/sdc1 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.416) 0:00:29.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": "Active\\ Devices\\ \\:\\ 2\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.038) 0:00:29.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_spare_devices_re": "Spare\\ Devices\\ \\:\\ 1\\\n" }, "changed": false } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:38:33 +0000 (0:00:00.043) 0:00:29.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_metadata_version_re": "Version\\ \\:\\ 1\\.0\\\n" }, "changed": false } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.046) 0:00:29.732 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.041) 0:00:29.773 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.042) 0:00:29.816 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.041) 0:00:29.857 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.531) 0:00:30.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:38:34 +0000 
(0:00:00.033) 0:00:30.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.032) 0:00:30.454 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.035) 0:00:30.490 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.033) 0:00:30.524 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.031) 0:00:30.555 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.033) 0:00:30.589 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.032) 0:00:30.622 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.034) 0:00:30.657 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:38:34 +0000 (0:00:00.037) 0:00:30.694 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.037) 0:00:30.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.032) 0:00:30.764 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.032) 0:00:30.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.031) 0:00:30.828 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.033) 0:00:30.862 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.029) 0:00:30.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.034) 0:00:30.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.031) 0:00:30.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.037) 0:00:30.995 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.034) 0:00:31.030 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": 
false } TASK [Re-run the same invocation without the RAID params] ********************** task path: /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:42 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.033) 0:00:31.063 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.069) 0:00:31.132 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:38:35 +0000 (0:00:00.044) 0:00:31.177 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.551) 0:00:31.728 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.076) 0:00:31.805 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.033) 0:00:31.839 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.033) 0:00:31.872 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.065) 0:00:31.937 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.025) 0:00:31.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.031)       0:00:31.994 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.033)       0:00:32.028 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": [
        {
            "disks": [
                "sda",
                "sdb",
                "sdc"
            ],
            "mount_point": "/opt/test1",
            "name": "test1",
            "state": "present",
            "type": "raid"
        }
    ]
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.037)       0:00:32.065 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.031)       0:00:32.097 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.031)       0:00:32.129 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.030)       0:00:32.159 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
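The storage_volumes spec in this run requests `type: raid` but carries no RAID level, which matches the `blivet.errors.DeviceError: RAID level None is an invalid value` failure that follows. A minimal sketch of an invocation that supplies the level explicitly is shown below; the `raid_level: raid1` value is an illustrative assumption (any level from the error message's list would do), not something taken from this log:

```yaml
# Hedged sketch: the same test volume with an explicit RAID level.
# raid_level here is hypothetical; the failing run omitted it entirely.
- hosts: all
  roles:
    - role: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: raid
            disks: [sda, sdb, sdc]
            raid_level: raid1        # absent in the failing invocation
            mount_point: /opt/test1
            state: present
```

Note the test at tests_raid_volume_options.yml:42 is labeled "Re-run the same invocation without the RAID params", so this run appears intended to exercise the role's handling of an omitted level rather than to document a user error.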
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.033)       0:00:32.193 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.051)       0:00:32.244 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:38:36 +0000 (0:00:00.030)       0:00:32.275 ********
An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: blivet.errors.DeviceError: RAID level None is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "changed": false,
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 77, in _get_level
    level = levels.raid_level(value)
  File "/usr/lib/python3.9/site-packages/blivet/devicelibs/raid.py", line 377, in raid_level
    raise RaidError("invalid RAID level descriptor %s" % descriptor)
blivet.errors.RaidError: invalid RAID level descriptor None

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 195, in level
    level = self._get_level(value, self._levels)
  File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock
    return m(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock
    return m(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/devices/raid.py", line 81, in _get_level
    raise ValueError(message % {"raid_level": value, "levels": choices})
ValueError: RAID level None is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1654105116.63-144035-34504810131107/AnsiballZ_blivet.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1654105116.63-144035-34504810131107/AnsiballZ_blivet.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1654105116.63-144035-34504810131107/AnsiballZ_blivet.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.9/runpy.py", line 210, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1717, in <module>
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1713, in main
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1675, in run_module
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 1311, in manage_volume
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 559, in manage
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 928, in _create
  File "/tmp/ansible_blivet_payload_q_sf9zqj/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 296, in _new_mdarray
  File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock
    return m(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/blivet.py", line 528, in new_mdarray
    return MDRaidArrayDevice(name, *args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock
    return m(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 132, in __init__
    raise e
  File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 126, in __init__
    self.level = level
  File "/usr/lib/python3.9/site-packages/blivet/threads.py", line 53, in run_with_lock
    return m(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/blivet/devices/md.py", line 197, in level
    raise errors.DeviceError(e)
blivet.errors.DeviceError: RAID level None is an invalid value. Must be one of (raid10, raid1, raid4, linear, raid5, raid6, raid0).

MODULE_STDERR:

Shared connection to 127.0.0.3 closed.

TASK [linux-system-roles.storage : failed message] *****************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99
Wednesday 01 June 2022 17:38:38 +0000 (0:00:01.703)       0:00:33.978 ********
fatal: [/cache/rhel-x.qcow2]: FAILED!
=> {
    "changed": false
}

MSG:

{u'exception': u'<same traceback as MODULE_STDOUT above>', u'_ansible_no_log': False, u'module_stderr': u'Shared connection to 127.0.0.3 closed.\r\n', u'changed': False, u'module_stdout': u'<same traceback as MODULE_STDOUT above>', u'failed': True, u'rc': 1, u'msg': u'MODULE FAILURE\nSee stdout/stderr for the exact error'}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:38:38 +0000 (0:00:00.036)       0:00:34.015 ********

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=107  changed=2  unreachable=0  failed=1  skipped=64  rescued=1  ignored=0

Wednesday 01 June 2022 17:38:38 +0000 (0:00:00.013)       0:00:34.028 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.25s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.31s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.17s
/tmp/tmp7247_7fr/tests/tests_raid_volume_options_scsi_generated.yml:3 ---------
linux-system-roles.storage : make sure blivet is available -------------- 1.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.97s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Gathering Facts --------------------------------------------------------- 0.86s /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml:2 ------------------------ linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : get required packages ---------------------- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.57s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Collect info about the volumes. 
----------------------------------------- 0.57s /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 ----------------------------- Read the /etc/fstab file for volume existence --------------------------- 0.56s /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 ----------------------------- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : set up new/current mounts ------------------ 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. 
Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:38:39 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:38:40 +0000 (0:00:01.384) 0:00:01.407 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 
----------------------------------------------------- PLAYBOOK: tests_remove_mount.yml *********************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_remove_mount.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:2 Wednesday 01 June 2022 17:38:40 +0000 (0:00:00.015) 0:00:01.423 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:12 Wednesday 01 June 2022 17:38:41 +0000 (0:00:01.119) 0:00:02.542 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:38:41 +0000 (0:00:00.040) 0:00:02.583 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:38:41 +0000 (0:00:00.158) 0:00:02.741 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.554) 0:00:03.296 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) 
=> { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.083) 0:00:03.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.023) 0:00:03.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.023) 0:00:03.426 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.204) 0:00:03.631 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:38:42 +0000 (0:00:00.018) 0:00:03.649 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:38:43 +0000 (0:00:01.185) 0:00:04.835 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:38:43 +0000 (0:00:00.047) 0:00:04.883 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:38:44 +0000 (0:00:00.047) 0:00:04.930 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:38:44 +0000 (0:00:00.693) 0:00:05.623 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 
June 2022 17:38:44 +0000 (0:00:00.081) 0:00:05.705 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:38:44 +0000 (0:00:00.022) 0:00:05.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:38:44 +0000 (0:00:00.025) 0:00:05.752 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:38:44 +0000 (0:00:00.024) 0:00:05.777 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:38:45 +0000 (0:00:00.839) 0:00:06.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { 
"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": 
"rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:38:47 +0000 (0:00:01.965) 0:00:08.582 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:38:47 +0000 (0:00:00.047) 0:00:08.629 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:38:47 +0000 (0:00:00.061) 0:00:08.691 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.575) 0:00:09.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.034) 0:00:09.301 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.028) 0:00:09.330 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the 
list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.036) 0:00:09.366 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.035) 0:00:09.402 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.034) 0:00:09.436 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.028) 0:00:09.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.028) 0:00:09.494 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.030) 0:00:09.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:38:48 +0000 (0:00:00.033) 0:00:09.558 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:38:49 +0000 (0:00:00.507) 0:00:10.066 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:38:49 +0000 (0:00:00.029) 0:00:10.096 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:15 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.892) 0:00:10.989 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] 
*********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:22 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.032) 0:00:11.021 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.044) 0:00:11.066 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "vdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.548) 0:00:11.614 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "vdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.037) 0:00:11.651 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.031) 0:00:11.683 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "vdb" ] } TASK [Create a LVM logical volume mounted at "/opt/test1"] ********************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:27 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.037) 0:00:11.720 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.058) 0:00:11.778 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for 
/cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:38:50 +0000 (0:00:00.049) 0:00:11.828 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.528) 0:00:12.357 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.072) 0:00:12.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.031) 0:00:12.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.032) 0:00:12.493 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.060) 0:00:12.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.025) 0:00:12.579 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.033) 0:00:12.613 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "3g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.040) 0:00:12.654 ******** ok: [/cache/rhel-x.qcow2] => { 
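[Editor's note: the `storage_pools` value printed by the "show storage_pools" task above implies a role invocation along the following lines. This is a hedged sketch reconstructed from the logged variable dump; the actual task at tests_remove_mount.yml:27 is not included in this log, and the exact `vars` wiring (e.g. whether `unused_disks` is referenced directly) is an assumption.]

```yaml
# Hypothetical reconstruction of the test task that produced the run above.
# Values (pool "foo", volume "test1", size 3g, mount point /opt/test1) are
# taken verbatim from the "show storage_pools" output in this log.
- name: Create a LVM logical volume mounted at "/opt/test1"
  include_role:
    name: linux-system-roles.storage
  vars:
    storage_pools:
      - name: foo
        disks: "{{ unused_disks }}"   # resolved to ['vdb'] per the log
        volumes:
          - name: test1
            size: 3g
            mount_point: /opt/test1
```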
"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.035) 0:00:12.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.031) 0:00:12.722 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.031) 0:00:12.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.031) 0:00:12.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.036) 0:00:12.820 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.055) 0:00:12.876 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:38:51 +0000 (0:00:00.030) 0:00:12.906 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:38:54 +0000 (0:00:02.367) 0:00:15.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.034) 0:00:15.307 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.030) 0:00:15.338 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 
0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "lvm2", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.041) 0:00:15.379 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.037) 0:00:15.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.032) 0:00:15.449 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:38:54 +0000 (0:00:00.030) 0:00:15.480 
******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:38:55 +0000 (0:00:01.006) 0:00:16.486 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:38:59 +0000 (0:00:03.836) 0:00:20.323 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:39:00 +0000 (0:00:00.702) 0:00:21.025 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:39:00 +0000 (0:00:00.417) 0:00:21.442 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:39:00 +0000 (0:00:00.031) 0:00:21.474 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:39 Wednesday 01 June 2022 17:39:01 +0000 (0:00:00.979) 0:00:22.454 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:39:01 +0000 (0:00:00.052) 0:00:22.506 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:39:01 +0000 (0:00:00.040) 0:00:22.547 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:39:01 +0000 (0:00:00.036) 0:00:22.583 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "3G", "type": "lvm", "uuid": "68285142-501b-45b0-aaf9-64142822000b" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "rOvrMH-V6xS-0WNd-O1Fx-x0eL-12DL-ZBOUpV" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:39:04 +0000 (0:00:02.609) 0:00:25.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003058", "end": "2022-06-01 13:39:04.099343", "rc": 0, "start": "2022-06-01 13:39:04.096285" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:39:04 +0000 (0:00:00.531) 0:00:25.724 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003512", "end": "2022-06-01 13:39:04.502111", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:39:04.498599" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.407) 0:00:26.132 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set 
_storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.072) 0:00:26.204 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.039) 0:00:26.243 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.067) 0:00:26.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.043) 0:00:26.354 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:39:05 +0000 (0:00:00.560) 0:00:26.915 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.043) 0:00:26.958 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.040) 0:00:26.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.040) 0:00:27.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.040) 0:00:27.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.032) 0:00:27.112 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.044) 0:00:27.156 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 
June 2022 17:39:06 +0000 (0:00:00.059) 0:00:27.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.035) 0:00:27.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.070) 0:00:27.321 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.033) 0:00:27.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.032) 0:00:27.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.032) 0:00:27.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.030) 0:00:27.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.033) 0:00:27.484 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.036) 0:00:27.521 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.061) 0:00:27.583 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.066) 0:00:27.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.032) 0:00:27.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.031) 0:00:27.714 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.033) 0:00:27.748 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.067) 0:00:27.815 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.037) 0:00:27.853 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:39:06 +0000 (0:00:00.036) 0:00:27.889 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.059) 0:00:27.948 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.038) 0:00:27.987 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.040) 0:00:28.027 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.032) 0:00:28.060 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.031) 0:00:28.092 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.032) 0:00:28.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.032) 0:00:28.156 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.033) 0:00:28.190 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.070) 0:00:28.260 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.071) 0:00:28.332 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.034) 0:00:28.366 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.033) 0:00:28.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.033) 0:00:28.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.034) 
0:00:28.468 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.033) 0:00:28.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.036) 0:00:28.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.032) 0:00:28.571 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.033) 0:00:28.604 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.034) 0:00:28.638 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:39:07 
+0000 (0:00:00.065) 0:00:28.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.038) 0:00:28.743 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:39:07 +0000 (0:00:00.161) 0:00:28.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:39:08 +0000 (0:00:00.043) 0:00:28.947 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2592070, "block_size": 4096, "block_total": 2618592, "block_used": 26522, "device": 
"/dev/md127", "fstype": "xfs", "inode_available": 5242301, "inode_total": 5242304, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 10617118720, "size_total": 10725752832, "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:39:08 +0000 (0:00:00.046) 0:00:28.994 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "assertion": false, "changed": false, "evaluated_to": false } MSG: Found unexpected mount state for volume 'test1' device PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=88 changed=2 unreachable=0 failed=1 skipped=47 rescued=0 ignored=0 Wednesday 01 June 2022 17:39:08 +0000 (0:00:00.024) 0:00:29.019 ******** =============================================================================== linux-system-roles.storage : set up new/current mounts ------------------ 3.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Collect info about the volumes. 
----------------------------------------- 2.61s /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 ----------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.37s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.97s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.19s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.12s /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:2 ------------------------------- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.98s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match 
the specified state --- 0.58s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Get the canonical device path for each member device -------------------- 0.56s /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 ------------------------ linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- Find unused disks in the system ----------------------------------------- 0.55s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- Read the /etc/fstab file for volume existence --------------------------- 0.53s /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 ----------------------------- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.51s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. 
Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:39:08 +0000 (0:00:00.024) 0:00:00.024 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:39:10 +0000 (0:00:01.378) 0:00:01.402 ******** =============================================================================== set up internal 
repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_remove_mount_nvme_generated.yml ******************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_remove_mount_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:39:10 +0000 (0:00:00.018) 0:00:01.421 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. 
Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:39:11 +0000 (0:00:00.026) 0:00:00.026 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:39:12 +0000 (0:00:01.319) 0:00:01.346 ******** =============================================================================== set up internal 
repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_remove_mount_scsi_generated.yml ******************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_remove_mount_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount_scsi_generated.yml:3 Wednesday 01 June 2022 17:39:12 +0000 (0:00:00.016) 0:00:01.362 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount_scsi_generated.yml:7 Wednesday 01 June 2022 17:39:13 +0000 (0:00:01.179) 0:00:02.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:2 Wednesday 01 June 2022 17:39:13 +0000 (0:00:00.028) 0:00:02.569 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:12 Wednesday 01 June 2022 17:39:14 +0000 (0:00:00.838) 0:00:03.408 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:39:14 +0000 (0:00:00.041) 0:00:03.449 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] 
********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:39:14 +0000 (0:00:00.157) 0:00:03.607 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.535) 0:00:04.142 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.079) 0:00:04.221 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.025) 0:00:04.246 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.033) 0:00:04.281 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.200) 0:00:04.481 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:39:15 +0000 (0:00:00.021) 0:00:04.502 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:39:16 +0000 (0:00:01.065) 0:00:05.568 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:39:16 +0000 (0:00:00.049) 0:00:05.618 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:39:16 +0000 (0:00:00.054) 0:00:05.672 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:39:17 +0000 (0:00:00.691) 0:00:06.364 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:39:17 +0000 (0:00:00.084) 0:00:06.448 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:39:17 +0000 (0:00:00.023) 0:00:06.472 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:39:17 +0000 (0:00:00.025) 0:00:06.498 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:39:17 +0000 (0:00:00.023) 0:00:06.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:39:18 +0000 (0:00:00.870) 0:00:07.392 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": 
"cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { 
"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": 
"systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:39:20 +0000 (0:00:01.854) 0:00:09.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.043) 0:00:09.290 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.029) 0:00:09.320 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.555) 0:00:09.876 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.032) 0:00:09.908 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.030) 0:00:09.938 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:39:20 +0000 (0:00:00.033) 0:00:09.972 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.036) 0:00:10.009 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.035) 0:00:10.044 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.030) 0:00:10.075 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.029) 0:00:10.104 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.030) 0:00:10.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.029) 0:00:10.164 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.527) 0:00:10.692 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:39:21 +0000 (0:00:00.030) 0:00:10.723 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:15 Wednesday 01 June 2022 17:39:22 +0000 (0:00:00.934) 0:00:11.657 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:22 Wednesday 01 June 2022 17:39:22 +0000 (0:00:00.031) 0:00:11.689 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:39:22 +0000 (0:00:00.047) 0:00:11.736 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": "Unable to find unused disk" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:39:23 +0000 (0:00:00.562) 0:00:12.299 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:39:23 +0000 (0:00:00.032) 0:00:12.332 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: Unable to find enough unused disks. Exiting playbook. PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0 Wednesday 01 June 2022 17:39:23 +0000 (0:00:00.019) 0:00:12.352 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.18s /tmp/tmp7247_7fr/tests/tests_remove_mount_scsi_generated.yml:3 ---------------- linux-system-roles.storage : make sure blivet is available -------------- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Gathering Facts --------------------------------------------------------- 0.84s /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:2 ------------------------------- linux-system-roles.storage : get required packages ---------------------- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Find unused disks in the system ----------------------------------------- 0.56s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.56s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : show storage_volumes ----------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 linux-system-roles.storage : show storage_pools ------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 include_tasks ----------------------------------------------------------- 0.05s /tmp/tmp7247_7fr/tests/tests_remove_mount.yml:22 ------------------------------ linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.04s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 
2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022  17:39:24 +0000 (0:00:00.023)       0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2        : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 01 June 2022  17:39:25 +0000 (0:00:01.363)       0:00:01.386 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_remove_nonexistent_pool.yml ************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:2
Wednesday 01 June 2022  17:39:25 +0000 (0:00:00.012)       0:00:01.398 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:14
Wednesday 01 June 2022  17:39:26 +0000 (0:00:01.135)       0:00:02.534 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022  17:39:26 +0000 (0:00:00.049)       0:00:02.584 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022  17:39:26 +0000 (0:00:00.164)       0:00:02.748 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.574)       0:00:03.322 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.080)       0:00:03.402 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.024)       0:00:03.427 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.025)       0:00:03.452 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.198)       0:00:03.651 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022  17:39:27 +0000 (0:00:00.020)       0:00:03.671 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022  17:39:28 +0000 (0:00:01.118)       0:00:04.789 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022  17:39:28 +0000 (0:00:00.049)       0:00:04.838 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022  17:39:28 +0000 (0:00:00.048)       0:00:04.887 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022  17:39:29 +0000 (0:00:00.719)       0:00:05.606 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022  17:39:29 +0000 (0:00:00.084)       0:00:05.690 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022  17:39:29 +0000 (0:00:00.021)       0:00:05.712 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:39:29 +0000 (0:00:00.022) 0:00:05.734 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:39:29 +0000 (0:00:00.022) 0:00:05.757 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:39:30 +0000 (0:00:00.840) 0:00:06.597 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { 
"name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:39:32 +0000 (0:00:01.880) 0:00:08.478 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:39:32 +0000 
(0:00:00.045) 0:00:08.523 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:39:32 +0000 (0:00:00.028) 0:00:08.552 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.575) 0:00:09.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.031) 0:00:09.159 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.029) 0:00:09.188 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.036) 0:00:09.225 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.035) 0:00:09.260 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.037) 0:00:09.297 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.029) 0:00:09.327 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.029) 0:00:09.356 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.030) 0:00:09.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:39:33 +0000 (0:00:00.030) 0:00:09.417 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:39:34 +0000 (0:00:00.507) 0:00:09.925 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:39:34 +0000 (0:00:00.030) 0:00:09.955 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:17 Wednesday 01 June 2022 17:39:35 +0000 (0:00:01.179) 0:00:11.135 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:24 Wednesday 01 June 2022 17:39:35 +0000 (0:00:00.037) 0:00:11.173 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:39:35 +0000 
(0:00:00.047) 0:00:11.221 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "vdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:39:35 +0000 (0:00:00.524) 0:00:11.745 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "vdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:39:35 +0000 (0:00:00.037) 0:00:11.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:39:35 +0000 (0:00:00.032) 0:00:11.815 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "vdc" ] } TASK [Removing nonexistent pool (with listed volumes)] ************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:28 Wednesday 01 June 2022 17:39:35 +0000 (0:00:00.036) 0:00:11.852 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.058) 0:00:11.910 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.046) 0:00:11.957 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.539) 0:00:12.496 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.081) 0:00:12.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.063) 0:00:12.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:39:36 +0000 (0:00:00.032) 0:00:12.674 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.064) 0:00:12.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.029) 0:00:12.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.031) 0:00:12.799 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdc" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.035) 0:00:12.834 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.033) 0:00:12.868 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:39:36 +0000 (0:00:00.029) 0:00:12.897 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:39:37 +0000 (0:00:00.030) 0:00:12.927 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:39:37 +0000 (0:00:00.035) 0:00:12.963 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:39:37 +0000 (0:00:00.031) 0:00:12.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:39:37 +0000 (0:00:00.044) 0:00:13.039 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:39:37 +0000 (0:00:00.030) 0:00:13.069 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", 
"device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:39:39 +0000 (0:00:02.333) 0:00:15.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:39:39 +0000 (0:00:00.033) 0:00:15.436 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:39:39 +0000 (0:00:00.030) 0:00:15.466 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:39:39 +0000 (0:00:00.042) 0:00:15.509 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:39:39 +0000 (0:00:00.039) 0:00:15.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:39:39 +0000 (0:00:00.035) 0:00:15.584 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:39:40 +0000 (0:00:00.550) 0:00:16.134 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:39:41 +0000 (0:00:00.957) 0:00:17.092 
******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:39:41 +0000 (0:00:00.029) 0:00:17.121 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:39:41 +0000 (0:00:00.680) 0:00:17.802 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:39:42 +0000 (0:00:00.400) 0:00:18.202 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:39:42 +0000 (0:00:00.028) 0:00:18.230 ******** ok: [/cache/rhel-x.qcow2] 
TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:40 Wednesday 01 June 2022 17:39:43 +0000 (0:00:00.936) 0:00:19.167 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:39:43 +0000 (0:00:00.085) 0:00:19.253 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 
2022 17:39:43 +0000 (0:00:00.040) 0:00:19.294 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:39:43 +0000 (0:00:00.030) 0:00:19.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, 
"/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:39:43 +0000 (0:00:00.497) 0:00:19.821 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002979", "end": "2022-06-01 13:39:43.691914", "rc": 0, "start": "2022-06-01 13:39:43.688935" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:39:44 +0000 (0:00:00.482) 0:00:20.304 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002888", "end": "2022-06-01 13:39:44.073959", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:39:44.071071" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:39:44 +0000 (0:00:00.389) 0:00:20.693 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 
17:39:44 +0000 (0:00:00.075) 0:00:20.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:39:44 +0000 (0:00:00.032) 0:00:20.802 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:39:44 +0000 (0:00:00.066) 0:00:20.869 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.041) 0:00:20.910 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.030) 0:00:20.941 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.029) 0:00:20.970 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.038) 0:00:21.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, 
"changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.038) 0:00:21.047 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.045) 0:00:21.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.031) 0:00:21.124 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.029) 0:00:21.154 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.068) 0:00:21.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.034) 0:00:21.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:39:45 +0000 
(0:00:00.036) 0:00:21.294 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.032) 0:00:21.327 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.030) 0:00:21.357 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.029) 0:00:21.387 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.029) 0:00:21.416 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.031) 0:00:21.448 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}
TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.029) 0:00:21.477 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.059) 0:00:21.536 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2
TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.064) 0:00:21.601 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.030) 0:00:21.632 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.028) 0:00:21.661 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.028) 0:00:21.689 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.062) 0:00:21.751 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}
TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.037) 0:00:21.789 ********
TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.030) 0:00:21.820 ********
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.031) 0:00:21.851 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}
TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:39:45 +0000 (0:00:00.032) 0:00:21.884 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2
TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.105) 0:00:21.989 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2
TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.067) 0:00:22.057 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.032) 0:00:22.089 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.034) 0:00:22.123 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.033) 0:00:22.156 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.031) 0:00:22.188 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.030) 0:00:22.218 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.030) 0:00:22.249 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.031) 0:00:22.280 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.035) 0:00:22.316 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}
TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.032) 0:00:22.348 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.064) 0:00:22.413 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": false, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}
TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.035) 0:00:22.448 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.144) 0:00:22.593 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}
TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.035) 0:00:22.629 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0"}, "changed": false}
TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.041) 0:00:22.670 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.030) 0:00:22.700 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed
TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.034) 0:00:22.735 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.033) 0:00:22.768 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.031) 0:00:22.799 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.030) 0:00:22.830 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.031) 0:00:22.861 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:39:46 +0000 (0:00:00.033) 0:00:22.895 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []}, "changed": false}
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.048) 0:00:22.944 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.028) 0:00:22.972 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.038) 0:00:23.010 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.030) 0:00:23.041 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}
TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.031) 0:00:23.073 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.031) 0:00:23.104 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.032) 0:00:23.137 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"exists": false}}
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.393) 0:00:23.530 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.043) 0:00:23.574 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.028) 0:00:23.602 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}
TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.035) 0:00:23.638 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.030) 0:00:23.668 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.025) 0:00:23.694 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.032) 0:00:23.726 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.032) 0:00:23.759 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.038) 0:00:23.798 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.030) 0:00:23.829 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.038) 0:00:23.867 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:39:47 +0000 (0:00:00.032) 0:00:23.900 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.035) 0:00:23.936 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:23.968 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:24.000 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.037) 0:00:24.038 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.072 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.028) 0:00:24.101 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.134 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.081) 0:00:24.216 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}
TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.030) 0:00:24.247 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.029) 0:00:24.276 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.031) 0:00:24.308 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.030) 0:00:24.339 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.029) 0:00:24.369 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:24.401 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.031) 0:00:24.433 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:24.465 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:24.497 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.031) 0:00:24.529 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.562 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.039) 0:00:24.601 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.634 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.667 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.032) 0:00:24.700 ********
skipping: [/cache/rhel-x.qcow2] => {}
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.031) 0:00:24.732 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.029) 0:00:24.762 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"changed": false, "skip_reason": "Conditional result was False", "skipped": true}}
TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.039) 0:00:24.801 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"}
TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.034) 0:00:24.835 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.030) 0:00:24.866 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:39:48 +0000 (0:00:00.033) 0:00:24.899 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.034) 0:00:24.934 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.033) 0:00:24.967 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.035) 0:00:25.002 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.030) 0:00:25.033 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.030) 0:00:25.063 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.032) 0:00:25.096 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.036) 0:00:25.133 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.033) 0:00:25.166 ********
TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.034) 0:00:25.200 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}
TASK [Removing nonexistent pool] ***********************************************
task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:42
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.034) 0:00:25.234 ********
TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.064) 0:00:25.299 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.049) 0:00:25.348 ********
ok: [/cache/rhel-x.qcow2]
TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:39:49 +0000 (0:00:00.560) 0:00:25.909 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.074) 0:00:25.984 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}
TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.033) 0:00:26.017 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}
TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.031) 0:00:26.048 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2
TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.062) 0:00:26.111 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.028) 0:00:26.139 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.033) 0:00:26.172 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": [{"disks": ["vdc"], "name": "foo", "state": "absent"}]}
TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.036) 0:00:26.208 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}
TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.035) 0:00:26.244 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.031) 0:00:26.275 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.030) 0:00:26.306 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.030) 0:00:26.336 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.034) 0:00:26.370 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}
TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.061) 0:00:26.431 ********
TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:39:50 +0000 (0:00:00.045) 0:00:26.477 ********
ok: [/cache/rhel-x.qcow2] => {"actions": [], "changed": false, "crypts": [], "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd"], "mounts": [], "packages": ["dosfstools", "xfsprogs", "mdadm"], "pools": [ { "disks":
[ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:39:52 +0000 (0:00:01.755) 0:00:28.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.032) 0:00:28.265 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.030) 0:00:28.295 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the 
list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.041) 0:00:28.337 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.043) 0:00:28.380 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.040) 0:00:28.421 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.034) 0:00:28.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.036) 0:00:28.492 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.033) 0:00:28.526 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:39:52 +0000 (0:00:00.033) 0:00:28.560 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:39:53 +0000 (0:00:00.439) 0:00:29.000 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:39:53 +0000 (0:00:00.036) 0:00:29.036 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** 
task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:51 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.911) 0:00:29.947 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.056) 0:00:30.004 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.040) 0:00:30.044 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.031) 0:00:30.076 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.405) 0:00:30.481 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003574", "end": "2022-06-01 13:39:54.270284", "rc": 0, "start": "2022-06-01 13:39:54.266710" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:39:54 +0000 (0:00:00.404) 0:00:30.886 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003688", "end": "2022-06-01 13:39:54.686216", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:39:54.682528" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.414) 0:00:31.300 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
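[Editor's note: the /etc/fstab contents read back above are whitespace-delimited records (device, mount point, fstype, options, dump, pass). A minimal sketch of how a verification step could split such lines into fields; the sample data is copied from the task output above, and the helper name `parse_fstab` is ours, not part of the role.]

```python
# Parse /etc/fstab-style lines into (device, mountpoint, fstype) tuples.
# Sample input copied verbatim from the "Read the /etc/fstab file" task above.
FSTAB = """\
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
"""

def parse_fstab(text):
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        fields = line.split()
        # fields: device, mountpoint, fstype, options, dump, pass
        entries.append((fields[0], fields[1], fields[2]))
    return entries

# Map each mount point to its device spec for easy lookup.
mounts = {mp: dev for dev, mp, _ in parse_fstab(FSTAB)}
```

A test like the one in this log would then assert that no entry for the removed pool's volumes is present among the parsed mount points.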
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.061) 0:00:31.361 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.033) 0:00:31.395 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.066) 0:00:31.461 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.041) 0:00:31.502 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.030) 0:00:31.533 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.029) 0:00:31.562 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.040) 0:00:31.603 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.084) 0:00:31.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.039) 0:00:31.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.033) 0:00:31.760 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.030) 0:00:31.791 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.059) 0:00:31.851 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:39:55 +0000 (0:00:00.035) 
0:00:31.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.036) 0:00:31.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.036) 0:00:31.959 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.031) 0:00:31.990 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.030) 0:00:32.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.034) 0:00:32.055 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.032) 0:00:32.088 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.032) 0:00:32.121 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.061) 0:00:32.182 ******** TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.030) 0:00:32.212 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.064) 0:00:32.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.037) 0:00:32.314 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.030) 0:00:32.345 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.031) 
0:00:32.376 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.035) 0:00:32.411 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.068) 0:00:32.479 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.030) 0:00:32.510 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.033) 0:00:32.544 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.034) 0:00:32.578 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.033) 0:00:32.611 ******** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.028) 0:00:32.640 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=136 changed=2 unreachable=0 failed=0 skipped=128 rescued=0 ignored=0

Wednesday 01 June 2022 17:39:56 +0000 (0:00:00.017) 0:00:32.657 ********
===============================================================================
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.33s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : get service facts -------------------------- 1.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.76s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : Update facts ------------------------------- 1.18s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:2 --------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.12s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.96s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
linux-system-roles.storage : Update facts ------------------------------- 0.94s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.68s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.58s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.57s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.56s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : remove obsolete mounts --------------------- 0.55s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
Find unused disks in the system ----------------------------------------- 0.52s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.51s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
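[Editor's note: the `PLAY RECAP` line above (`ok=136 changed=2 unreachable=0 failed=0 skipped=128 rescued=0 ignored=0`) is the summary a CI wrapper typically inspects to decide pass/fail. A hedged sketch of parsing such a line; the helper name `parse_recap` is ours, not an Ansible API.]

```python
import re

# Example PLAY RECAP host line, copied from the recap above.
RECAP = ("/cache/rhel-x.qcow2 : ok=136 changed=2 unreachable=0 "
         "failed=0 skipped=128 rescued=0 ignored=0")

def parse_recap(line):
    """Split a recap line into (host, {counter: value}) pairs."""
    host, _, stats = line.partition(" : ")
    counters = {key: int(val) for key, val in re.findall(r"(\w+)=(\d+)", stats)}
    return host.strip(), counters

host, counters = parse_recap(RECAP)
# A run is considered failed if anything failed or was unreachable.
run_failed = counters["failed"] > 0 or counters["unreachable"] > 0
```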
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:39:57 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:39:58 +0000 (0:00:01.334) 0:00:01.357 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_remove_nonexistent_pool_nvme_generated.yml *********************
2 plays in /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:39:58 +0000 (0:00:00.014) 0:00:01.372 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.33s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:39:59 +0000 (0:00:00.023) 0:00:00.023 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:40:00 +0000 (0:00:01.360) 0:00:01.383 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.36s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_remove_nonexistent_pool_scsi_generated.yml ********************* 2 plays in /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_scsi_generated.yml:3 Wednesday 01 June 2022 17:40:01 +0000 (0:00:00.013) 0:00:01.397 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_scsi_generated.yml:7 Wednesday 01 June 2022 17:40:02 +0000 (0:00:01.138) 0:00:02.535 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:2 Wednesday 01 June 2022 17:40:02 +0000 (0:00:00.024) 0:00:02.559 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:14 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.846) 0:00:03.406 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.048) 0:00:03.454 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.162) 0:00:03.616 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.548) 
0:00:04.165 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.081) 0:00:04.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.025) 0:00:04.272 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:40:03 +0000 (0:00:00.024) 0:00:04.297 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:40:04 +0000 (0:00:00.208) 0:00:04.505 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:40:04 +0000 (0:00:00.019) 0:00:04.524 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:40:05 +0000 (0:00:01.091) 0:00:05.616 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:40:05 +0000 (0:00:00.053) 0:00:05.670 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:40:05 +0000 (0:00:00.049) 0:00:05.719 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:40:06 +0000 (0:00:00.715) 0:00:06.434 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:40:06 +0000 (0:00:00.081) 0:00:06.516 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:40:06 +0000 (0:00:00.022) 0:00:06.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:40:06 +0000 (0:00:00.056) 0:00:06.595 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:40:06 +0000 (0:00:00.021) 0:00:06.617 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:40:07 +0000 (0:00:00.843) 0:00:07.460 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", 
"status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:40:08 +0000 (0:00:01.829) 0:00:09.289 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:40:08 +0000 (0:00:00.045) 0:00:09.335 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:40:08 +0000 (0:00:00.029) 0:00:09.364 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.586) 0:00:09.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.031) 0:00:09.981 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.028) 0:00:10.010 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.033) 0:00:10.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.033) 0:00:10.077 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.032) 0:00:10.109 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.028) 0:00:10.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.029) 0:00:10.167 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.027) 0:00:10.194 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:40:09 +0000 (0:00:00.030) 0:00:10.225 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:40:10 +0000 (0:00:00.526) 0:00:10.752 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:40:10 +0000 (0:00:00.028) 0:00:10.780 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:17 Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.909) 0:00:11.690 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:24 Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.058) 0:00:11.748 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.046) 0:00:11.795 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": "Unable to find unused disk" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.508) 0:00:12.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.031) 0:00:12.335 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: Unable to find enough unused disks. Exiting playbook. 
PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0

Wednesday 01 June 2022 17:40:11 +0000 (0:00:00.019) 0:00:12.355 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.83s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_scsi_generated.yml:3 -----
linux-system-roles.storage : make sure blivet is available -------------- 1.09s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : Update facts ------------------------------- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:2 --------------------
linux-system-roles.storage : make sure required packages are installed --- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.72s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Find unused disks in the system ----------------------------------------- 0.51s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : include the appropriate provider tasks ----- 0.21s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.16s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
Mark tasks to be skipped ------------------------------------------------ 0.06s
/tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml:17 -------------------
linux-system-roles.storage : make sure COPR support packages are present --- 0.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
linux-system-roles.storage : show storage_pools ------------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
linux-system-roles.storage : show storage_volumes ----------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:40:12 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:40:14 +0000 (0:00:01.341) 0:00:01.364 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_resize.yml *****************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_resize.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:1
Wednesday 01 June 2022 17:40:14 +0000 (0:00:00.038) 0:00:01.402 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:20
Wednesday 01 June 2022 17:40:15 +0000 (0:00:01.110) 0:00:02.513 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:40:15 +0000 (0:00:00.046) 0:00:02.559 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:40:15 +0000 (0:00:00.553) 0:00:02.718 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:40:15 +0000 (0:00:00.553) 0:00:03.272 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:40:16 +0000 (0:00:00.078) 0:00:03.350 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:40:16 +0000 (0:00:00.024) 0:00:03.375 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 
2022 17:40:16 +0000 (0:00:00.022) 0:00:03.398 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:40:16 +0000 (0:00:00.199) 0:00:03.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:40:16 +0000 (0:00:00.020) 0:00:03.618 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:40:17 +0000 (0:00:01.112) 0:00:04.730 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:40:17 +0000 (0:00:00.048) 0:00:04.779 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:40:17 +0000 (0:00:00.053) 0:00:04.832 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
[linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:40:18 +0000 (0:00:00.732) 0:00:05.565 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:40:18 +0000 (0:00:00.083) 0:00:05.648 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:40:18 +0000 (0:00:00.021) 0:00:05.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:40:18 +0000 (0:00:00.021) 0:00:05.692 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:40:18 +0000 (0:00:00.020) 0:00:05.713 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:40:19 +0000 (0:00:00.876) 0:00:06.589 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:40:21 +0000 (0:00:01.864) 0:00:08.454 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.042) 0:00:08.497 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.025) 0:00:08.523 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.535) 0:00:09.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.029) 0:00:09.088 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.026) 0:00:09.114 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.031) 0:00:09.146 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.031) 0:00:09.178 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.033) 0:00:09.211 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.029) 0:00:09.241 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:40:21 +0000 (0:00:00.030) 0:00:09.272 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:40:22 +0000 (0:00:00.030) 0:00:09.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:40:22 +0000 (0:00:00.030) 0:00:09.332 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:40:22 +0000 (0:00:00.505) 0:00:09.838 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:40:22 +0000 (0:00:00.030) 0:00:09.869 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:23 Wednesday 01 June 2022 17:40:23 +0000 (0:00:00.867) 0:00:10.737 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:30 Wednesday 01 June 2022 17:40:23 +0000 (0:00:00.030) 0:00:10.767 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:40:23 +0000 (0:00:00.047) 0:00:10.815 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "vdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.551) 0:00:11.367 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "vdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.037) 0:00:11.404 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.031) 0:00:11.435 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "vdb" ] } TASK [Create one LVM logical volume with "5g" under one volume group] ********** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:37 Wednesday 01 June 
2022 17:40:24 +0000 (0:00:00.036) 0:00:11.472 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.057) 0:00:11.529 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.048) 0:00:11.577 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.558) 0:00:12.135 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.069) 0:00:12.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:40:24 +0000 (0:00:00.059) 0:00:12.264 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.032) 0:00:12.297 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.062) 0:00:12.359 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.026) 0:00:12.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.035) 0:00:12.422 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": 
"lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.040) 0:00:12.462 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.035) 0:00:12.498 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.034) 0:00:12.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.032) 0:00:12.564 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.031) 0:00:12.596 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.033) 0:00:12.629 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.043) 0:00:12.673 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:40:25 +0000 (0:00:00.028) 0:00:12.701 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:40:27 +0000 (0:00:02.281) 0:00:14.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.032) 0:00:15.014 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.027) 0:00:15.042 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": 
"/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.039) 0:00:15.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.038) 0:00:15.121 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.038) 0:00:15.160 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:40:27 +0000 (0:00:00.035) 0:00:15.196 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:40:28 +0000 (0:00:00.998) 0:00:16.194 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:40:29 +0000 (0:00:00.550) 0:00:16.745 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:40:30 +0000 (0:00:00.652) 0:00:17.397 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:40:30 +0000 (0:00:00.394) 0:00:17.791 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:40:30 +0000 (0:00:00.031) 0:00:17.822 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:52 Wednesday 01 June 2022 17:40:31 +0000 (0:00:00.947) 0:00:18.770 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:40:31 +0000 (0:00:00.091) 0:00:18.861 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:40:31 +0000 (0:00:00.039) 0:00:18.900 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:40:31 +0000 (0:00:00.031) 0:00:18.932 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "s1Oc6q-whLY-AaH3-36Xr-heSM-6Y5u-k6ISki" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:40:32 +0000 (0:00:00.515) 0:00:19.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003500", "end": "2022-06-01 13:40:31.945257", "rc": 0, "start": "2022-06-01 13:40:31.941757" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:40:32 +0000 (0:00:00.485) 0:00:19.933 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003069", "end": "2022-06-01 13:40:32.355135", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:40:32.352066" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.406) 0:00:20.339 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set 
_storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.066) 0:00:20.406 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.032) 0:00:20.439 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.079) 0:00:20.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.054) 0:00:20.572 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.528) 0:00:21.100 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.045) 0:00:21.146 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.040) 0:00:21.186 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.037) 0:00:21.224 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:40:33 +0000 (0:00:00.037) 0:00:21.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.030) 0:00:21.292 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.044) 0:00:21.337 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 
June 2022 17:40:34 +0000 (0:00:00.060) 0:00:21.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.031) 0:00:21.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.034) 0:00:21.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.562 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.034) 0:00:21.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.630 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.662 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.062) 0:00:21.725 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.059) 0:00:21.785 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.031) 0:00:21.816 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.032) 0:00:21.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.069) 0:00:21.917 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.066) 0:00:21.984 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.038) 0:00:22.022 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.045) 0:00:22.068 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.067) 0:00:22.136 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.041) 0:00:22.178 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.042) 0:00:22.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:40:34 +0000 (0:00:00.033) 0:00:22.254 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 0:00:22.287 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.034) 0:00:22.321 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.033) 0:00:22.355 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.033) 0:00:22.388 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.067) 0:00:22.456 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.076) 0:00:22.532 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.038) 0:00:22.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 0:00:22.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.033) 0:00:22.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 
0:00:22.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 0:00:22.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 0:00:22.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.032) 0:00:22.767 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.033) 0:00:22.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.035) 0:00:22.835 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:40:35 
+0000 (0:00:00.064) 0:00:22.899 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.041) 0:00:22.941 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.130) 0:00:23.071 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.038) 0:00:23.110 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 
327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199010, "block_size": 4096, "block_total": 1268648, "block_used": 69638, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911144960, "size_total": 5196382208, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.045) 0:00:23.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.037) 0:00:23.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.036) 0:00:23.229 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:40:35 +0000 (0:00:00.040) 0:00:23.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.034) 0:00:23.304 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.032) 0:00:23.336 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.032) 0:00:23.369 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.033) 0:00:23.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:40:36 +0000 
(0:00:00.049) 0:00:23.452 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.043) 0:00:23.496 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.042) 0:00:23.539 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.032) 0:00:23.572 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.034) 0:00:23.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.041) 0:00:23.648 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.040) 0:00:23.689 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105226.9281216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105226.9281216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 24314, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105226.9281216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.422) 0:00:24.111 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.092) 0:00:24.204 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:40:36 +0000 (0:00:00.040) 0:00:24.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.037) 0:00:24.281 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.034) 0:00:24.316 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.037) 0:00:24.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.033) 0:00:24.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.030) 0:00:24.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.030) 0:00:24.447 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:40:37 
+0000 (0:00:00.037) 0:00:24.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.029) 0:00:24.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.029) 0:00:24.543 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.031) 0:00:24.575 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.030) 0:00:24.605 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.029) 0:00:24.634 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.035) 0:00:24.670 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.034) 0:00:24.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.031) 0:00:24.735 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.035) 0:00:24.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.032) 0:00:24.804 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.033) 0:00:24.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.032) 0:00:24.870 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.034) 0:00:24.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.034) 0:00:24.939 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.037) 0:00:24.977 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.033) 0:00:25.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.030) 0:00:25.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:40:37 +0000 (0:00:00.030) 0:00:25.072 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.498) 0:00:25.570 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.372) 0:00:25.942 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.037) 0:00:25.980 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.033) 0:00:26.014 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.030) 0:00:26.044 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.029) 0:00:26.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.030) 0:00:26.104 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.030) 0:00:26.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.030) 0:00:26.165 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.039) 0:00:26.205 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:40:38 +0000 (0:00:00.034) 0:00:26.239 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.042) 0:00:26.282 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", 
"name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.037043", "end": "2022-06-01 13:40:38.716818", "rc": 0, "start": "2022-06-01 13:40:38.679775" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.424) 0:00:26.706 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.044) 0:00:26.751 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.042) 0:00:26.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.035) 0:00:26.829 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.036) 0:00:26.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.034) 0:00:26.900 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.034) 0:00:26.935 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.034) 0:00:26.969 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.033) 0:00:27.002 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.028) 0:00:27.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change volume_size to "9g"] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:54 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.072) 0:00:27.103 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.067) 0:00:27.171 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:40:39 +0000 (0:00:00.050) 0:00:27.221 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.701) 0:00:27.923 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.074) 0:00:27.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used 
in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.034) 0:00:28.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.033) 0:00:28.066 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.065) 0:00:28.131 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.030) 0:00:28.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.032) 0:00:28.195 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "9g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:40:40 +0000 
(0:00:00.038) 0:00:28.233 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:40:40 +0000 (0:00:00.035) 0:00:28.268 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.032) 0:00:28.301 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.033) 0:00:28.335 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.036) 0:00:28.371 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.032) 0:00:28.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.047) 0:00:28.451 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:40:41 +0000 (0:00:00.029) 0:00:28.481 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "mdadm", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:40:43 +0000 (0:00:02.316) 0:00:30.798 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.031) 0:00:30.829 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.028) 0:00:30.857 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "mdadm", "dosfstools", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.038) 0:00:30.896 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.038) 0:00:30.935 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.041) 0:00:30.976 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:40:43 +0000 (0:00:00.038) 0:00:31.015 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:40:44 +0000 (0:00:00.717) 0:00:31.732 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:40:44 +0000 (0:00:00.417) 0:00:32.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:40:45 +0000 (0:00:00.695) 0:00:32.845 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:40:45 +0000 (0:00:00.401) 0:00:33.247 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:40:46 +0000 (0:00:00.034) 0:00:33.281 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:68 Wednesday 01 June 2022 17:40:46 +0000 (0:00:00.868) 0:00:34.149 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:40:46 +0000 (0:00:00.058) 0:00:34.207 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:40:46 +0000 (0:00:00.040) 0:00:34.248 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:40:47 +0000 (0:00:00.037) 0:00:34.286 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "9G", "type": "lvm", "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": 
{ "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "s1Oc6q-whLY-AaH3-36Xr-heSM-6Y5u-k6ISki" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:40:47 +0000 (0:00:00.420) 0:00:34.707 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002872", "end": "2022-06-01 13:40:47.112633", "rc": 0, "start": "2022-06-01 13:40:47.109761" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 
/boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:40:47 +0000 (0:00:00.393) 0:00:35.100 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002935", "end": "2022-06-01 13:40:47.521687", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:40:47.518752" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.407) 0:00:35.508 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.066) 0:00:35.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.033) 0:00:35.607 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.106) 0:00:35.714 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.042) 0:00:35.756 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.420) 0:00:36.177 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.045) 0:00:36.222 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:40:48 +0000 (0:00:00.039) 0:00:36.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.038) 0:00:36.300 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.041) 0:00:36.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.031) 0:00:36.373 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.043) 0:00:36.417 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.067) 0:00:36.484 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.035) 0:00:36.519 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.033) 0:00:36.552 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.033) 0:00:36.586 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.035) 0:00:36.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.034) 0:00:36.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.037) 0:00:36.693 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.033) 0:00:36.726 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.034) 0:00:36.761 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.061) 0:00:36.822 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.065) 0:00:36.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.033) 0:00:36.920 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.033) 0:00:36.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.032) 0:00:36.986 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.066) 0:00:37.053 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.038) 0:00:37.092 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.038) 0:00:37.130 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.061) 0:00:37.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:40:49 +0000 (0:00:00.042) 0:00:37.234 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.045) 0:00:37.280 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.032) 0:00:37.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] 
**************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.033) 0:00:37.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.032) 0:00:37.378 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.036) 0:00:37.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.034) 0:00:37.449 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.065) 0:00:37.515 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.069) 0:00:37.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.033) 0:00:37.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.034) 0:00:37.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.033) 0:00:37.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.031) 0:00:37.719 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.033) 0:00:37.753 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.034) 0:00:37.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task 
path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.037) 0:00:37.824 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.034) 0:00:37.859 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.034) 0:00:37.894 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
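The warning above recommends setting `loop_var` under `loop_control` so the inner loop does not clobber the outer loop's `storage_test_volume`. A minimal sketch of that fix, assuming a task shape like the one in `test-verify-pool-volumes.yml` (task name, file name, and variable names here are hypothetical, not taken from the test suite):

```yaml
# Hypothetical illustration of the loop_control fix the warning suggests:
# rename the inner loop variable so it no longer collides with an outer
# loop that already uses 'storage_test_volume'.
- name: verify the volumes
  include_tasks: test-verify-volume.yml
  loop: "{{ storage_test_pool.volumes }}"
  loop_control:
    loop_var: storage_test_volume_inner   # was the default 'storage_test_volume'
```

Tasks inside `test-verify-volume.yml` would then reference `storage_test_volume_inner`, and the collision warning disappears.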
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.140) 0:00:38.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.038) 0:00:38.073 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.121) 0:00:38.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.036) 0:00:38.231 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2178069, "block_size": 4096, "block_total": 2305877, "block_used": 127808, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 8921370624, "size_total": 9444872192, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2178069, "block_size": 4096, "block_total": 2305877, "block_used": 127808, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 8921370624, "size_total": 9444872192, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:40:50 +0000 (0:00:00.045) 0:00:38.277 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.041) 0:00:38.319 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.038) 0:00:38.358 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.045) 0:00:38.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.032) 0:00:38.436 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.033) 0:00:38.470 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.034) 0:00:38.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.032) 0:00:38.537 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.046) 0:00:38.584 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.035) 0:00:38.619 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.036) 0:00:38.655 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.030) 0:00:38.686 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.035) 0:00:38.722 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.039) 0:00:38.761 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.039) 0:00:38.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105242.7481215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105242.7481215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 24314, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105242.7481215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.405) 0:00:39.207 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:40:51 +0000 (0:00:00.039) 0:00:39.246 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 
2022 17:40:52 +0000 (0:00:00.037) 0:00:39.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.035) 0:00:39.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.033) 0:00:39.353 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.035) 0:00:39.389 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.032) 0:00:39.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.031) 0:00:39.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.029) 0:00:39.482 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.039) 0:00:39.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.034) 0:00:39.556 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.031) 0:00:39.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.032) 0:00:39.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.033) 0:00:39.654 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.033) 0:00:39.688 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.041) 0:00:39.729 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.042) 0:00:39.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.032) 0:00:39.804 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.036) 0:00:39.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.034) 0:00:39.875 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.036) 0:00:39.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.033) 0:00:39.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.039) 0:00:39.985 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.035) 0:00:40.021 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.031) 0:00:40.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.031) 0:00:40.084 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
17:40:52 +0000 (0:00:00.034) 0:00:40.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:40:52 +0000 (0:00:00.035) 0:00:40.153 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.408) 0:00:40.562 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.415) 0:00:40.978 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "9663676416" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.039) 0:00:41.017 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.035) 0:00:41.053 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.033) 0:00:41.086 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of 
parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.030) 0:00:41.117 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.030) 0:00:41.148 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.030) 0:00:41.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.030) 0:00:41.208 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 9663676416, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:40:53 +0000 (0:00:00.039) 0:00:41.248 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.034) 0:00:41.283 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.041) 0:00:41.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.041228", "end": "2022-06-01 13:40:53.779955", "rc": 0, "start": "2022-06-01 13:40:53.738727" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.445) 0:00:41.770 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.041) 0:00:41.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.041) 0:00:41.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.036) 0:00:41.890 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 
01 June 2022 17:40:54 +0000 (0:00:00.036) 0:00:41.926 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.034) 0:00:41.961 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.036) 0:00:41.998 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.035) 0:00:42.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.033) 0:00:42.068 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.033) 0:00:42.101 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change volume size to "5g"] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:70 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.034) 0:00:42.136 ******** TASK [linux-system-roles.storage : set 
platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.070) 0:00:42.206 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:40:54 +0000 (0:00:00.052) 0:00:42.259 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.566) 0:00:42.825 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:40:55 +0000 
(0:00:00.078) 0:00:42.904 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.033) 0:00:42.937 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.034) 0:00:42.971 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.064) 0:00:43.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.027) 0:00:43.063 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.032) 0:00:43.096 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK 
[linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.038) 0:00:43.134 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.033) 0:00:43.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.030) 0:00:43.199 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.032) 0:00:43.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:40:55 +0000 (0:00:00.031) 0:00:43.262 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:40:56 +0000 (0:00:00.036) 0:00:43.299 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:40:56 +0000 (0:00:00.048) 0:00:43.347 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:40:56 +0000 (0:00:00.028) 0:00:43.376 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "dosfstools", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:40:58 +0000 (0:00:02.300) 0:00:45.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.031) 0:00:45.708 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.031) 0:00:45.739 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "mounted" } ], "packages": [ "mdadm", "dosfstools", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.045) 0:00:45.785 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.041) 0:00:45.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.038) 0:00:45.864 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:40:58 +0000 (0:00:00.032) 0:00:45.897 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:40:59 +0000 (0:00:00.690) 0:00:46.587 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:40:59 +0000 (0:00:00.421) 0:00:47.008 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:41:00 +0000 (0:00:00.686) 0:00:47.695 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:41:00 +0000 (0:00:00.414) 0:00:48.110 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:41:00 +0000 (0:00:00.030) 0:00:48.140 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:84
Wednesday 01 June 2022 17:41:01 +0000 (0:00:00.947) 0:00:49.087 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:41:01 +0000 (0:00:00.062) 0:00:49.150 ********
ok: [/cache/rhel-x.qcow2] => {
    "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm",
    "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:41:01 +0000 (0:00:00.039) 0:00:49.190 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:41:01 +0000 (0:00:00.035) 0:00:49.225 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" },
        "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" },
        "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" },
        "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" },
        "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" },
        "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" },
        "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" },
        "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" },
        "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" },
        "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" },
        "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "s1Oc6q-whLY-AaH3-36Xr-heSM-6Y5u-k6ISki" },
        "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:41:02 +0000 (0:00:00.414) 0:00:49.640 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002980", "end": "2022-06-01 13:41:02.057172", "rc": 0, "start": "2022-06-01 13:41:02.054192" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:41:02 +0000 (0:00:00.403) 0:00:50.044 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002916", "end": "2022-06-01 13:41:02.457758", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:41:02.454842" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.402) 0:00:50.447 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.118) 0:00:50.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.033) 0:00:50.599 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.066) 0:00:50.665 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.044) 0:00:50.709 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.401) 0:00:51.111 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.044) 0:00:51.156 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.041) 0:00:51.197 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.037) 0:00:51.235 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:41:03 +0000 (0:00:00.039) 0:00:51.274 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.033) 0:00:51.308 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.051) 0:00:51.359 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.061) 0:00:51.420 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.034) 0:00:51.455 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.035) 0:00:51.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.033) 0:00:51.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.033) 0:00:51.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.032) 0:00:51.591 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.047) 0:00:51.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.034) 0:00:51.673 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.037) 0:00:51.710 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.063) 0:00:51.773 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.064) 0:00:51.838 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.035) 0:00:51.873 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.034) 0:00:51.908 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.031) 0:00:51.939 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.076) 0:00:52.016 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.045) 0:00:52.062 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.043) 0:00:52.105 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.062) 0:00:52.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.039) 0:00:52.207 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:41:04 +0000 (0:00:00.041) 0:00:52.249 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.033) 0:00:52.282 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.032) 0:00:52.315 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.034) 0:00:52.349 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.033) 0:00:52.383 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.032) 0:00:52.416 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.079) 0:00:52.495 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.077) 0:00:52.573 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.035) 0:00:52.609 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.036) 0:00:52.646 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.037) 0:00:52.683 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.033) 0:00:52.717 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.032) 0:00:52.749 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.034) 0:00:52.784 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.036) 0:00:52.820 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.100) 0:00:52.921 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.035) 0:00:52.956 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.064) 0:00:53.020 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.038) 0:00:53.059 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.129) 0:00:53.188 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.037) 0:00:53.225 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [ { "block_available": 1199011, "block_size": 4096, "block_total": 1273760, "block_used": 74749, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911149056, "size_total": 5217320960, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [ { "block_available": 1199011, "block_size": 4096, "block_total": 1273760, "block_used": 74749, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4911149056, "size_total": 5217320960, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:41:05 +0000 (0:00:00.042) 0:00:53.268 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.042) 0:00:53.311 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.037) 0:00:53.348 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.048) 0:00:53.396 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.043) 0:00:53.440 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.035) 0:00:53.475 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.033) 0:00:53.509 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.032) 0:00:53.541 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.050) 0:00:53.591 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.043) 0:00:53.635 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.042) 0:00:53.677 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.033) 0:00:53.710 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.034) 0:00:53.745 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.042) 0:00:53.788 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.041) 0:00:53.829 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105257.6071215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105257.6071215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 24314, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105257.6071215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:41:06 +0000 (0:00:00.410) 0:00:54.239 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.043) 0:00:54.283 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.040) 0:00:54.324 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.036) 0:00:54.360 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.032) 0:00:54.392 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.038) 0:00:54.431 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:54.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.034) 0:00:54.500 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.032) 0:00:54.532 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.039) 0:00:54.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.032) 0:00:54.604 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.032) 0:00:54.637 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:54.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.036) 0:00:54.708 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.035) 0:00:54.744 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.053) 0:00:54.797 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.053) 0:00:54.851 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:54.884 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.031) 0:00:54.915 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:54.949 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID]
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.031) 0:00:54.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.030) 0:00:55.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.029) 0:00:55.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:55.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.033) 0:00:55.107 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.035) 0:00:55.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.036) 0:00:55.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:41:07 +0000 (0:00:00.046) 0:00:55.225 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.417) 0:00:55.643 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.399) 0:00:56.042 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.041) 0:00:56.083 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.036) 0:00:56.120 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.034) 0:00:56.154 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.032) 0:00:56.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.036) 0:00:56.224 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:41:08 +0000 (0:00:00.033) 0:00:56.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.032) 0:00:56.289 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.037) 0:00:56.327 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.036) 0:00:56.364 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.041) 0:00:56.406 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043682", "end": "2022-06-01 13:41:08.877282", "rc": 0, "start": "2022-06-01 13:41:08.833600" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.460) 0:00:56.867 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.040) 0:00:56.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.042) 0:00:56.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.033) 0:00:56.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.033) 0:00:57.017 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.035) 0:00:57.053 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.039) 0:00:57.092 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.033) 0:00:57.126 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.034) 0:00:57.160 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.030) 0:00:57.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Try to create LVM with a too-large volume size, resize to "12884901888.0"] *** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:88 Wednesday 01 June 2022 17:41:09 +0000 (0:00:00.038) 
0:00:57.229 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.054) 0:00:57.284 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.048) 0:00:57.332 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.592) 0:00:57.924 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.074) 0:00:57.999 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.034) 0:00:58.033 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.033) 0:00:58.066 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.068) 0:00:58.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.028) 0:00:58.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.039) 0:00:58.203 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "volumes": [ 
{ "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "12884901888.0" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:10 +0000 (0:00:00.052) 0:00:58.256 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.036) 0:00:58.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.037) 0:00:58.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.034) 0:00:58.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.034) 0:00:58.399 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.035) 0:00:58.435 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.047) 0:00:58.482 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:11 +0000 (0:00:00.028) 0:00:58.511 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: volume 'test1' cannot be resized to '12 GiB' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:41:13 +0000 (0:00:01.833) 0:01:00.344 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'vdb'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'size': u'12884901888.0', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'vdb'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, 
u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"volume 'test1' cannot be resized to '12 GiB'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.042) 0:01:00.387 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:106 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.029) 0:01:00.416 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:112 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.037) 0:01:00.454 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM with volume size equal disk's size, resize to "10737418240"] *** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:121 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.043) 0:01:00.498 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.054) 0:01:00.552 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.047) 0:01:00.600 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.556) 0:01:01.157 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.074) 0:01:01.231 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:13 +0000 (0:00:00.035) 0:01:01.267 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.033) 0:01:01.301 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.068) 0:01:01.369 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.029) 0:01:01.398 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.033) 0:01:01.432 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "10737418240" } ] } ] } TASK [linux-system-roles.storage : show 
storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.047) 0:01:01.480 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.036) 0:01:01.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.036) 0:01:01.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.034) 0:01:01.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.035) 0:01:01.624 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.033) 0:01:01.657 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.046) 0:01:01.704 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:14 +0000 (0:00:00.028) 0:01:01.732 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2", "dosfstools", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:41:16 +0000 (0:00:02.270) 0:01:04.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.033) 0:01:04.037 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.032) 0:01:04.069 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": 
"mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2", "dosfstools", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.043) 0:01:04.113 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.040) 0:01:04.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.041) 0:01:04.194 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:41:16 +0000 (0:00:00.032) 0:01:04.227 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:41:17 +0000 (0:00:00.700) 0:01:04.927 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:41:18 +0000 (0:00:00.434) 0:01:05.361 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:41:18 +0000 (0:00:00.672) 0:01:06.034 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", 
"mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:41:19 +0000 (0:00:00.405) 0:01:06.440 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:41:19 +0000 (0:00:00.030) 0:01:06.471 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:134 Wednesday 01 June 2022 17:41:20 +0000 (0:00:00.883) 0:01:07.355 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:41:20 +0000 (0:00:00.048) 0:01:07.403 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "10737418240", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:41:20 +0000 (0:00:00.042) 0:01:07.446 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:41:20 +0000 (0:00:00.034) 0:01:07.480 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test1", "size": "10G", "type": "lvm", "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "s1Oc6q-whLY-AaH3-36Xr-heSM-6Y5u-k6ISki" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:41:20 +0000 (0:00:00.399) 0:01:07.880 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003296", "end": "2022-06-01 13:41:20.311446", "rc": 0, "start": "2022-06-01 13:41:20.308150" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.417) 0:01:08.298 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003314", "end": "2022-06-01 13:41:20.686174", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:41:20.682860" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.374) 0:01:08.673 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.069) 0:01:08.742 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.032) 0:01:08.775 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.064) 0:01:08.839 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:41:21 +0000 (0:00:00.039) 0:01:08.879 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.412) 0:01:09.292 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": {
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.042) 0:01:09.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.038) 0:01:09.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.036) 0:01:09.409 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.040) 0:01:09.450 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.032) 0:01:09.483 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.044) 
0:01:09.527 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.060) 0:01:09.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.068) 0:01:09.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.033) 0:01:09.690 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.037) 0:01:09.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.039) 0:01:09.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.034) 0:01:09.802 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.039) 0:01:09.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.038) 0:01:09.880 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.034) 0:01:09.915 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.062) 0:01:09.978 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.065) 0:01:10.043 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.031) 0:01:10.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.031) 0:01:10.106 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.037) 0:01:10.143 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.076) 0:01:10.220 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:41:22 +0000 (0:00:00.037) 0:01:10.257 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.034) 0:01:10.291 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.058) 0:01:10.350 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.038) 0:01:10.388 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.036) 0:01:10.425 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.030) 0:01:10.455 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.029) 0:01:10.485 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.033) 0:01:10.519 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.034) 0:01:10.553 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": {
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.035) 0:01:10.588 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.064) 0:01:10.653 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.066) 0:01:10.720 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.030) 0:01:10.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.033) 0:01:10.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.030) 0:01:10.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.030) 0:01:10.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.033) 0:01:10.879 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.033) 0:01:10.912 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.032) 0:01:10.944 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.031) 0:01:10.975 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.030) 0:01:11.006 ******** [WARNING]: The loop variable 'storage_test_volume' is already in 
use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.067) 0:01:11.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.039) 0:01:11.113 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.127) 0:01:11.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:41:23 +0000 (0:00:00.036) 0:01:11.277 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2422023, "block_size": 4096, "block_total": 2562885, "block_used": 140862, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9920606208, "size_total": 10497576960, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2422023, "block_size": 4096, "block_total": 2562885, "block_used": 140862, "device": "/dev/mapper/foo-test1", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9920606208, "size_total": 10497576960, "uuid": "ece4f256-0104-40bc-905a-77fd4620771b" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.047) 0:01:11.324 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.040) 0:01:11.364 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.039) 0:01:11.404 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.038) 0:01:11.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.031) 0:01:11.473 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.033) 0:01:11.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.033) 0:01:11.540 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.033) 0:01:11.574 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.116) 0:01:11.690 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.038) 0:01:11.729 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.040) 0:01:11.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.033) 0:01:11.802 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.038) 0:01:11.841 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.040) 0:01:11.882 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:41:24 +0000 (0:00:00.042) 0:01:11.924 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105275.9391215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105275.9391215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 24314, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105275.9391215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.419) 0:01:12.344 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.043) 0:01:12.388 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.042) 0:01:12.430 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.040) 0:01:12.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.034) 0:01:12.505 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.039) 0:01:12.545 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.031) 0:01:12.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.031) 0:01:12.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.034) 0:01:12.643 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.039) 0:01:12.683 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.034) 0:01:12.717 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.035) 0:01:12.752 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.032) 0:01:12.784 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.033) 0:01:12.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.033) 
0:01:12.852 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.040) 0:01:12.893 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.038) 0:01:12.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.038) 0:01:12.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.035) 0:01:13.005 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.035) 0:01:13.041 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.038) 0:01:13.079 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.033) 0:01:13.112 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.036) 0:01:13.148 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.035) 0:01:13.184 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.030) 0:01:13.215 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:41:25 +0000 (0:00:00.030) 0:01:13.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.032) 0:01:13.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.033) 0:01:13.312 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.392) 0:01:13.704 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.415) 0:01:14.119 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "10737418240" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.040) 0:01:14.160 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.039) 0:01:14.199 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.033) 
0:01:14.233 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:41:26 +0000 (0:00:00.032) 0:01:14.266 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.033) 0:01:14.300 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.033) 0:01:14.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.034) 0:01:14.368 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 10737418240, "changed": false, "failed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.040) 0:01:14.408 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.036) 0:01:14.445 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.041) 0:01:14.486 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.039231", "end": "2022-06-01 13:41:26.927867", "rc": 0, "start": "2022-06-01 13:41:26.888636" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.429) 0:01:14.916 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.041) 0:01:14.958 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.042) 0:01:15.000 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.034) 0:01:15.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.035) 0:01:15.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.037) 0:01:15.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.037) 0:01:15.145 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.043) 0:01:15.189 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.037) 0:01:15.227 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:41:27 +0000 (0:00:00.032) 0:01:15.259 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Try to create LVM with an invalid size specification, resize to "xyz GiB"] *** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:138 Wednesday 01 June 2022 17:41:28 +0000 
(0:00:00.035) 0:01:15.295 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.054) 0:01:15.349 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.045) 0:01:15.395 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.550) 0:01:15.945 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.082) 0:01:16.028 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.033) 0:01:16.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.033) 0:01:16.095 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.068) 0:01:16.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.027) 0:01:16.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.033) 0:01:16.225 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "volumes": [ 
{ "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "xyz GiB" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:28 +0000 (0:00:00.039) 0:01:16.264 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.038) 0:01:16.302 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.031) 0:01:16.333 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.031) 0:01:16.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.029) 0:01:16.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.031) 0:01:16.426 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.048) 0:01:16.474 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:29 +0000 (0:00:00.034) 0:01:16.509 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: invalid size specification for volume 'test1': 'xyz GiB' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:41:31 +0000 (0:00:01.771) 0:01:18.280 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'vdb'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'size': u'xyz GiB', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'vdb'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, 
u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid size specification for volume 'test1': 'xyz GiB'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.047) 0:01:18.327 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:156 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.030) 0:01:18.358 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:162 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.037) 0:01:18.395 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Try to create LVM with an invalid size specification, resize to "none"] *** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:171 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.039) 0:01:18.435 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.062) 0:01:18.497 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.049) 0:01:18.547 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.631) 0:01:19.178 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:31 +0000 (0:00:00.080) 0:01:19.258 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.035) 0:01:19.293 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.034) 0:01:19.328 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.065) 0:01:19.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.027) 0:01:19.421 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.032) 0:01:19.454 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "volumes": [ { "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "size": "none" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] 
*********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.038) 0:01:19.492 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.032) 0:01:19.525 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.033) 0:01:19.558 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.030) 0:01:19.589 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.029) 0:01:19.619 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.032) 0:01:19.652 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.047) 0:01:19.699 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:32 +0000 (0:00:00.032) 0:01:19.731 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: invalid size specification for volume 'test1': 'none' TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Wednesday 01 June 2022 17:41:34 +0000 (0:00:01.842) 0:01:21.574 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'raid_level': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'vdb'], u'raid_chunk_size': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'ext4', u'mount_options': u'defaults', u'size': u'none', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'mount_passno': 0, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'cached': False, u'type': u'lvm', u'disks': [u'vdb'], u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'type': u'lvm', u'encryption_cipher': None}], u'volumes': [], u'pool_defaults': {u'encryption_password': None, u'raid_metadata_version': None, u'encryption': False, u'encryption_key_size': None, u'disks': [], u'encryption_key': None, u'encryption_luks_version': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'encryption_cipher': None, u'raid_chunk_size': None, u'type': u'lvm', u'raid_level': None, u'raid_spare_count': None}, u'volume_defaults': {u'raid_metadata_version': None, 
u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'mount_device_identifier': u'uuid', u'mount_passno': 0, u'fs_type': u'xfs', u'mount_options': u'defaults', u'type': u'lvm', u'encryption_luks_version': None, u'raid_spare_count': None, u'size': 0, u'cache_mode': None, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'cached': False, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_overwrite_existing': True, u'encryption_key_size': None, u'fs_create_options': u'', u'deduplication': None}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"invalid size specification for volume 'test1': 'none'"} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.042) 0:01:21.616 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:189 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.029) 0:01:21.646 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output] ******************************************************* task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:195 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.038) 0:01:21.684 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:202 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.040) 0:01:21.725 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.057) 0:01:21.782 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:34 +0000 (0:00:00.047) 0:01:21.829 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.551) 0:01:22.381 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.077) 0:01:22.458 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.527 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.077) 0:01:22.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.030) 0:01:22.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.036) 0:01:22.671 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "state": "absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] 
*********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.042) 0:01:22.713 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.040) 0:01:22.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.823 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.034) 0:01:22.892 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.052) 0:01:22.945 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:35 +0000 (0:00:00.088) 0:01:23.034 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:41:38 +0000 (0:00:02.375) 0:01:25.409 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.033) 0:01:25.443 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.030) 0:01:25.474 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": 
"/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.049) 0:01:25.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.040) 0:01:25.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.037) 0:01:25.602 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", 
"opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:41:38 +0000 (0:00:00.401) 0:01:26.004 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:41:39 +0000 (0:00:00.703) 0:01:26.708 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:41:39 +0000 (0:00:00.033) 0:01:26.741 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:41:40 +0000 (0:00:00.694) 0:01:27.436 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": 
false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:41:40 +0000 (0:00:00.405) 0:01:27.841 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:41:40 +0000 (0:00:00.031) 0:01:27.873 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:215 Wednesday 01 June 2022 17:41:41 +0000 (0:00:00.879) 0:01:28.753 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:41:41 +0000 (0:00:00.052) 0:01:28.806 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:41:41 +0000 (0:00:00.038) 0:01:28.844 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:41:41 +0000 (0:00:00.033) 0:01:28.878 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { 
"fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:41:42 +0000 (0:00:00.419) 0:01:29.297 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002956", "end": "2022-06-01 13:41:41.699646", "rc": 0, "start": "2022-06-01 13:41:41.696690" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:41:42 +0000 (0:00:00.388) 0:01:29.685 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002936", "end": "2022-06-01 13:41:42.098678", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:41:42.095742" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:41:42 +0000 (0:00:00.399) 0:01:30.084 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:41:42 +0000 (0:00:00.071) 0:01:30.156 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:41:42 +0000 (0:00:00.107) 0:01:30.264 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.068) 0:01:30.332 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.042) 
0:01:30.375 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.029) 0:01:30.405 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.438 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.041) 0:01:30.479 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.038) 0:01:30.518 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.034) 0:01:30.553 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.030) 0:01:30.584 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.031) 0:01:30.615 ******** included: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.062) 0:01:30.677 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.034) 0:01:30.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.745 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.778 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.035) 0:01:30.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] 
********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.880 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.914 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.033) 0:01:30.948 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.064) 0:01:31.012 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.064) 0:01:31.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.032) 0:01:31.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.034) 0:01:31.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.032) 0:01:31.178 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:41:43 +0000 (0:00:00.065) 0:01:31.244 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.037) 0:01:31.281 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.031) 0:01:31.313 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.033) 0:01:31.346 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 
01 June 2022 17:41:44 +0000 (0:00:00.032) 0:01:31.379 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.064) 0:01:31.443 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.084) 0:01:31.527 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.038) 0:01:31.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.034) 0:01:31.600 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.032) 0:01:31.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.030) 0:01:31.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.033) 0:01:31.696 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.032) 0:01:31.728 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.032) 0:01:31.761 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.031) 0:01:31.793 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.033) 0:01:31.826 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.064) 0:01:31.891 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.036) 0:01:31.927 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.125) 0:01:32.052 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.039) 0:01:32.092 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.042) 0:01:32.135 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:41:44 +0000 (0:00:00.102) 0:01:32.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.039) 0:01:32.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.033) 0:01:32.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.033) 0:01:32.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.034) 
0:01:32.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.031) 0:01:32.411 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.035) 0:01:32.446 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.048) 0:01:32.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.028) 0:01:32.523 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.043) 0:01:32.566 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.033) 0:01:32.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.034) 0:01:32.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.035) 0:01:32.670 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.026) 0:01:32.696 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.399) 0:01:33.095 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.042) 0:01:33.137 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.027) 0:01:33.165 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.040) 0:01:33.206 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.032) 0:01:33.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:41:45 +0000 (0:00:00.033) 0:01:33.272 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.040) 0:01:33.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.033) 0:01:33.346 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.032) 0:01:33.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.028) 0:01:33.407 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.030) 0:01:33.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.031) 0:01:33.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.030) 0:01:33.500 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.034) 0:01:33.534 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.031) 0:01:33.566 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.040) 0:01:33.606 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.040) 0:01:33.647 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.035) 0:01:33.682 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.032) 0:01:33.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.033) 0:01:33.748 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.035) 0:01:33.783 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.034) 0:01:33.817 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.035) 0:01:33.853 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.037) 0:01:33.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.042) 0:01:33.933 ******** skipping: [/cache/rhel-x.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.035) 0:01:33.969 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.034) 0:01:34.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.037) 0:01:34.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.033) 0:01:34.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.035) 0:01:34.110 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.032) 0:01:34.143 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.038) 0:01:34.181 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.034) 0:01:34.216 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:41:46 +0000 (0:00:00.034) 0:01:34.250 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.034) 0:01:34.285 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.037) 0:01:34.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.037) 0:01:34.360 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.037) 0:01:34.398 ******** ok: 
[/cache/rhel-x.qcow2] => { "storage_test_expected_size": "10737418240" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.036) 0:01:34.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.030) 0:01:34.465 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.030) 0:01:34.495 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.033) 0:01:34.528 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.030) 0:01:34.559 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.029) 0:01:34.588 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.032) 0:01:34.621 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.032) 0:01:34.653 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.033) 0:01:34.687 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.036) 0:01:34.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.031) 0:01:34.755 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.027) 0:01:34.783 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Create a LVM logical volume with "5g" for ext3 FS] *********************** task 
path: /tmp/tmp7247_7fr/tests/tests_resize.yml:219 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.031) 0:01:34.814 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.119) 0:01:34.934 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:41:47 +0000 (0:00:00.045) 0:01:34.980 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.569) 0:01:35.550 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty 
list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.078) 0:01:35.628 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.033) 0:01:35.661 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.034) 0:01:35.696 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.065) 0:01:35.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.025) 0:01:35.787 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.031) 0:01:35.818 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": 
[ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext3", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.037) 0:01:35.856 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.033) 0:01:35.889 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.029) 0:01:35.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.033) 0:01:35.952 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.030) 0:01:35.982 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.030) 0:01:36.013 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.043) 0:01:36.057 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:41:48 +0000 (0:00:00.031) 0:01:36.088 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "lvm2", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:41:51 +0000 (0:00:02.414) 0:01:38.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.033) 0:01:38.537 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.029) 0:01:38.566 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": 
"/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "dosfstools", "e2fsprogs", "lvm2", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.042) 0:01:38.609 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.045) 0:01:38.654 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] 
********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.037) 0:01:38.691 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:41:51 +0000 (0:00:00.029) 0:01:38.721 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:41:52 +0000 (0:00:00.927) 0:01:39.649 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:41:52 +0000 (0:00:00.427) 0:01:40.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:41:53 +0000 (0:00:00.709) 0:01:40.786 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:41:53 +0000 (0:00:00.408) 0:01:41.194 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:41:54 +0000 (0:00:00.084) 0:01:41.279 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:233 Wednesday 01 June 2022 17:41:55 +0000 (0:00:01.233) 0:01:42.513 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:41:55 +0000 (0:00:00.061) 0:01:42.575 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:41:55 +0000 (0:00:00.047) 0:01:42.622 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:41:55 +0000 (0:00:00.032) 0:01:42.655 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext3", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "QAM9Vn-TaSt-GynO-bz4T-Dir0-tdm1-zNY6B4" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:41:55 +0000 (0:00:00.427) 0:01:43.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002880", "end": "2022-06-01 13:41:55.490316", "rc": 0, "start": "2022-06-01 13:41:55.487436" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext3 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.394) 0:01:43.476 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002934", "end": "2022-06-01 13:41:55.872755", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:41:55.869821" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.385) 0:01:43.862 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.090) 0:01:43.952 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.034) 0:01:43.987 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.067) 0:01:44.054 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:41:56 +0000 (0:00:00.043) 0:01:44.098 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.411) 0:01:44.509 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.044) 0:01:44.554 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.041) 0:01:44.595 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.038) 0:01:44.633 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.040) 0:01:44.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.033) 0:01:44.707 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.046) 
0:01:44.753 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.061) 0:01:44.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.033) 0:01:44.848 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.035) 0:01:44.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.051) 0:01:44.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.034) 0:01:44.970 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.033) 0:01:45.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.033) 0:01:45.036 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.034) 0:01:45.070 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.038) 0:01:45.109 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:41:57 +0000 (0:00:00.078) 0:01:45.187 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.132) 0:01:45.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.031) 0:01:45.351 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.029) 0:01:45.380 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.031) 0:01:45.411 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.061) 0:01:45.473 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.034) 0:01:45.507 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.037) 0:01:45.544 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.063) 0:01:45.607 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.037) 0:01:45.645 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.038) 0:01:45.684 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.033) 0:01:45.718 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.032) 0:01:45.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.037) 0:01:45.787 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.034) 0:01:45.822 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.033) 0:01:45.856 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.065) 0:01:45.922 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.074) 0:01:45.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.033) 0:01:46.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.034) 0:01:46.064 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.032) 0:01:46.097 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.035) 0:01:46.133 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.032) 0:01:46.165 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.031) 0:01:46.197 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.031) 0:01:46.228 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:41:58 +0000 (0:00:00.032) 0:01:46.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.030) 0:01:46.292 ******** [WARNING]: The loop variable 'storage_test_volume' is already in 
use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.063) 0:01:46.356 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.037) 0:01:46.393 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.126) 0:01:46.520 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.047) 0:01:46.567 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1205649, "block_size": 4096, "block_total": 1271208, "block_used": 65559, "device": "/dev/mapper/foo-test1", "fstype": "ext3", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4938338304, "size_total": 5206867968, "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1205649, "block_size": 4096, "block_total": 1271208, "block_used": 65559, "device": "/dev/mapper/foo-test1", "fstype": "ext3", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 4938338304, "size_total": 5206867968, "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.051) 0:01:46.619 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.039) 0:01:46.659 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.039) 0:01:46.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.038) 0:01:46.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.031) 0:01:46.769 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.030) 0:01:46.799 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.031) 0:01:46.831 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.032) 0:01:46.863 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.047) 0:01:46.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.037) 0:01:46.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.035) 0:01:46.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.031) 0:01:47.015 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.030) 0:01:47.045 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.036) 0:01:47.081 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:41:59 +0000 (0:00:00.040) 0:01:47.123 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105310.4441216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105310.4441216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 25061, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105310.4441216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.400) 0:01:47.523 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.041) 0:01:47.565 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.040) 0:01:47.605 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.036) 0:01:47.642 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.033) 0:01:47.675 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.036) 0:01:47.712 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.035) 0:01:47.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.032) 0:01:47.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.032) 0:01:47.812 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.039) 0:01:47.852 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.035) 0:01:47.887 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.032) 0:01:47.919 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 0:01:47.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 0:01:47.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 
0:01:48.022 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.039) 0:01:48.062 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.042) 0:01:48.105 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 0:01:48.139 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 0:01:48.174 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.031) 0:01:48.205 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.033) 0:01:48.239 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:42:00 +0000 (0:00:00.034) 0:01:48.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.050) 0:01:48.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.041) 0:01:48.364 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.035) 0:01:48.400 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.037) 0:01:48.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.034) 0:01:48.472 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.034) 0:01:48.507 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:42:01 +0000 (0:00:00.402) 0:01:48.910 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.402) 0:01:49.312 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.044) 0:01:49.356 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.039) 0:01:49.396 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.035) 0:01:49.432 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.034) 0:01:49.466 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.034) 0:01:49.501 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.033) 0:01:49.534 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.040) 0:01:49.575 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.043) 0:01:49.618 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.036) 0:01:49.655 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.049) 0:01:49.705 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.031287", "end": "2022-06-01 13:42:02.173091", "rc": 0, "start": "2022-06-01 13:42:02.141804" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.462) 0:01:50.167 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.048) 0:01:50.216 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:42:02 +0000 (0:00:00.046) 0:01:50.262 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.036) 0:01:50.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.039) 0:01:50.339 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.037) 0:01:50.376 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.035) 0:01:50.411 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.033) 0:01:50.445 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.036) 0:01:50.481 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.098) 0:01:50.580 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change volume size to "9g"] **********************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:235
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.036) 0:01:50.617 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.074) 0:01:50.691 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:42:03 +0000 (0:00:00.052) 0:01:50.744 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.551) 0:01:51.296 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.081) 0:01:51.377 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.038) 0:01:51.415 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.036) 0:01:51.452 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.068) 0:01:51.520 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.029) 0:01:51.549 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.036) 0:01:51.586 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext3", "mount_point": "/opt/test1", "name": "test1", "size": "9g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.042) 0:01:51.628 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.036) 0:01:51.665 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.032) 0:01:51.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.032) 0:01:51.731 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.035) 0:01:51.766 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.037) 0:01:51.803 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.051) 0:01:51.855 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:42:04 +0000 (0:00:00.031) 0:01:51.887 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "mdadm", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:42:07 +0000 (0:00:02.745) 0:01:54.633 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.034) 0:01:54.668 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.033) 0:01:54.701 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "mdadm", "lvm2", "dosfstools" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.043) 0:01:54.744 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.039) 0:01:54.784 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.036) 0:01:54.820 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:42:07 +0000 (0:00:00.033) 0:01:54.854 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:42:08 +0000 (0:00:00.753) 0:01:55.608 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:42:08 +0000 (0:00:00.434) 0:01:56.042 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:42:09 +0000 (0:00:00.670) 0:01:56.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:42:09 +0000 (0:00:00.463) 0:01:57.176 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:42:09 +0000 (0:00:00.040) 0:01:57.217 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:249
Wednesday 01 June 2022 17:42:10 +0000 (0:00:00.918) 0:01:58.136 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:42:10 +0000 (0:00:00.066) 0:01:58.203 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:42:10 +0000 (0:00:00.045) 0:01:58.248 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:42:11 +0000 (0:00:00.034) 0:01:58.283 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext3", "label": "", "name": "/dev/mapper/foo-test1", "size": "9G", "type": "lvm", "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "QAM9Vn-TaSt-GynO-bz4T-Dir0-tdm1-zNY6B4" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:42:11 +0000 (0:00:00.426) 0:01:58.709 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002967", "end": "2022-06-01 13:42:11.130884", "rc": 0, "start": "2022-06-01 13:42:11.127917" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
/dev/mapper/foo-test1 /opt/test1 ext3 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:42:11 +0000 (0:00:00.410) 0:01:59.120 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002902", "end": "2022-06-01 13:42:11.527459", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:42:11.524557" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.395) 0:01:59.516 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.070) 0:01:59.587 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.035) 0:01:59.622 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.069) 0:01:59.692 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.043) 0:01:59.735 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.437) 0:02:00.173 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" }

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.049) 0:02:00.222 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022 17:42:12 +0000 (0:00:00.043) 0:02:00.266 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.037) 0:02:00.303 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.039) 0:02:00.343 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.032) 0:02:00.375 ********
ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" }
MSG: All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.046) 0:02:00.421 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.058) 0:02:00.480 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.097) 0:02:00.578 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.035) 0:02:00.613 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.033) 0:02:00.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.034) 0:02:00.682 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.035) 0:02:00.718 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.034) 0:02:00.752 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.037) 0:02:00.789 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.039) 0:02:00.828 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.067) 0:02:00.896 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.034) 0:02:00.963 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.032) 0:02:00.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.032) 0:02:01.030 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.031) 0:02:01.062 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.069) 0:02:01.131 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.041) 0:02:01.173 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.039) 0:02:01.212 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:42:13 +0000 (0:00:00.064) 0:02:01.277 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.042) 0:02:01.320 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.042) 0:02:01.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.033) 0:02:01.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.034) 0:02:01.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.034) 0:02:01.465 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.034) 0:02:01.499 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.037) 0:02:01.537 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.072) 0:02:01.609 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.070) 0:02:01.680 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.032) 0:02:01.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.031) 0:02:01.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.031) 0:02:01.775 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.030) 0:02:01.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.032) 0:02:01.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.035) 0:02:01.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.031) 0:02:01.906 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.032) 0:02:01.939 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.033) 0:02:01.972 ******** [WARNING]: The loop variable 'storage_test_volume' is already in 
use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.063) 0:02:02.035 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.037) 0:02:02.073 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.129) 0:02:02.202 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:42:14 +0000 (0:00:00.039) 0:02:02.241 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2185028, "block_size": 4096, "block_total": 2305886, "block_used": 120858, "device": "/dev/mapper/foo-test1", "fstype": "ext3", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 8949874688, "size_total": 9444909056, "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2185028, "block_size": 4096, "block_total": 2305886, "block_used": 120858, "device": "/dev/mapper/foo-test1", "fstype": "ext3", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 8949874688, "size_total": 9444909056, "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.046) 0:02:02.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.038) 0:02:02.326 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.038) 0:02:02.364 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.038) 0:02:02.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.031) 0:02:02.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.032) 0:02:02.467 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.032) 0:02:02.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.039) 0:02:02.540 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.122) 0:02:02.662 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.036) 0:02:02.699 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.039) 0:02:02.738 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.031) 0:02:02.769 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.031) 0:02:02.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.036) 0:02:02.838 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:42:15 +0000 (0:00:00.040) 0:02:02.879 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105326.5691216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105326.5691216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 25061, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105326.5691216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.417) 0:02:03.296 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.037) 0:02:03.334 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.037) 0:02:03.371 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.036) 0:02:03.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.031) 0:02:03.439 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.038) 0:02:03.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.030) 0:02:03.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.030) 0:02:03.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.031) 0:02:03.571 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.038) 0:02:03.610 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.034) 0:02:03.644 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.037) 0:02:03.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.033) 0:02:03.715 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.035) 0:02:03.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.034) 
0:02:03.785 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.041) 0:02:03.827 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.040) 0:02:03.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.038) 0:02:03.906 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.038) 0:02:03.945 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.035) 0:02:03.981 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.034) 0:02:04.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.035) 0:02:04.051 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.037) 0:02:04.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.039) 0:02:04.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.036) 0:02:04.164 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.031) 0:02:04.196 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.034) 0:02:04.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:42:16 +0000 (0:00:00.031) 0:02:04.262 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.402) 0:02:04.664 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.393) 0:02:05.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "9663676416" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.039) 0:02:05.098 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.040) 0:02:05.138 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.041) 0:02:05.180 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.049) 0:02:05.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:42:17 +0000 (0:00:00.037) 0:02:05.268 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.037) 0:02:05.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.035) 0:02:05.341 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 9663676416, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.039) 0:02:05.380 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.038) 0:02:05.418 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.042) 0:02:05.461 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043835", "end": "2022-06-01 13:42:17.910045", "rc": 0, "start": "2022-06-01 13:42:17.866210" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.438) 0:02:05.900 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.043) 0:02:05.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.042) 0:02:05.986 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.036) 0:02:06.023 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.094) 0:02:06.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.036) 0:02:06.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.037) 0:02:06.191 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.034) 0:02:06.226 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:42:18 +0000 (0:00:00.031) 0:02:06.257 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.031) 0:02:06.289 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change volume size to "5g"] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:251 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.038) 
0:02:06.327 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.073) 0:02:06.401 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.054) 0:02:06.455 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.530) 0:02:06.986 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.083) 0:02:07.070 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.031) 0:02:07.102 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.036) 0:02:07.138 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.066) 0:02:07.204 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.028) 0:02:07.233 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:42:19 +0000 (0:00:00.034) 0:02:07.267 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": 
"lvm", "volumes": [ { "fs_type": "ext3", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.041) 0:02:07.309 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.035) 0:02:07.345 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.032) 0:02:07.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.032) 0:02:07.410 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.031) 0:02:07.441 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.034) 0:02:07.475 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.061) 0:02:07.537 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:42:20 +0000 (0:00:00.034) 0:02:07.571 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:42:23 +0000 (0:00:02.719) 0:02:10.290 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.035) 0:02:10.326 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.032) 0:02:10.359 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", 
"/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.042) 0:02:10.402 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.043) 0:02:10.445 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.038) 0:02:10.484 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.032) 0:02:10.516 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:42:23 +0000 (0:00:00.676) 0:02:11.193 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:42:24 +0000 (0:00:00.443) 0:02:11.636 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:42:25 +0000 (0:00:00.704) 0:02:12.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:42:25 +0000 (0:00:00.430) 0:02:12.771 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:42:25 +0000 (0:00:00.030) 0:02:12.802 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:265 Wednesday 01 June 2022 17:42:26 +0000 (0:00:00.905) 0:02:13.707 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:42:26 +0000 (0:00:00.066) 0:02:13.773 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:42:26 +0000 (0:00:00.046) 0:02:13.820 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:42:26 +0000 (0:00:00.033) 0:02:13.854 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext3", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "QAM9Vn-TaSt-GynO-bz4T-Dir0-tdm1-zNY6B4" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:42:26 +0000 (0:00:00.418) 0:02:14.273 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003021", "end": "2022-06-01 13:42:26.690147", "rc": 0, "start": "2022-06-01 13:42:26.687126" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext3 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:42:27 +0000 (0:00:00.408) 0:02:14.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002885", "end": "2022-06-01 13:42:27.089419", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:42:27.086534" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:42:27 +0000 (0:00:00.398) 0:02:15.080 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:42:27 +0000 (0:00:00.081) 0:02:15.161 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:42:27 +0000 (0:00:00.046) 0:02:15.208 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.071) 0:02:15.280 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.046) 0:02:15.326 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.391) 0:02:15.717 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.101) 0:02:15.819 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.043) 0:02:15.862 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.039) 0:02:15.902 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.038) 0:02:15.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.031) 0:02:15.972 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.047) 
0:02:16.020 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.057) 0:02:16.078 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.032) 0:02:16.111 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.034) 0:02:16.145 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.031) 0:02:16.176 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.033) 0:02:16.210 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022 17:42:28 +0000 (0:00:00.034) 0:02:16.245 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.034) 0:02:16.279 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.032) 0:02:16.312 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.037) 0:02:16.349 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.059) 0:02:16.409 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.061) 0:02:16.470 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.035) 0:02:16.506 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.034) 0:02:16.540 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.033) 0:02:16.574 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.064) 0:02:16.639 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.040) 0:02:16.679 ********
skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => {
    "_storage_test_pool_member_path": "/dev/vdb",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.037) 0:02:16.717 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.066) 0:02:16.784 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.038) 0:02:16.823 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.038) 0:02:16.861 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.032) 0:02:16.893 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.031) 0:02:16.925 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.032) 0:02:16.957 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.033) 0:02:16.991 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.032) 0:02:17.023 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.066) 0:02:17.090 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.072) 0:02:17.162 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.034) 0:02:17.196 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.036) 0:02:17.232 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:42:29 +0000 (0:00:00.034) 0:02:17.267 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.033) 0:02:17.300 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.033) 0:02:17.333 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.034) 0:02:17.367 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.033) 0:02:17.401 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.041) 0:02:17.443 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.037) 0:02:17.480 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.067) 0:02:17.548 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.040) 0:02:17.589 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.183) 0:02:17.772 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.040) 0:02:17.812 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 1205650,
                "block_size": 4096,
                "block_total": 1273760,
                "block_used": 68110,
                "device": "/dev/mapper/foo-test1",
                "fstype": "ext3",
                "inode_available": 327669,
                "inode_total": 327680,
                "inode_used": 11,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime",
                "size_available": 4938342400,
                "size_total": 5217320960,
                "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 1205650,
                "block_size": 4096,
                "block_total": 1273760,
                "block_used": 68110,
                "device": "/dev/mapper/foo-test1",
                "fstype": "ext3",
                "inode_available": 327669,
                "inode_total": 327680,
                "inode_used": 11,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime",
                "size_available": 4938342400,
                "size_total": 5217320960,
                "uuid": "634d4ad0-43ed-4c80-b728-ccfc43861076"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.044) 0:02:17.857 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.038) 0:02:17.896 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.040) 0:02:17.936 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.044) 0:02:17.980 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.031) 0:02:18.011 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.031) 0:02:18.043 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.031) 0:02:18.074 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.034) 0:02:18.108 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 ext3 defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.049) 0:02:18.158 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.041) 0:02:18.199 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.039) 0:02:18.239 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:42:30 +0000 (0:00:00.033) 0:02:18.273 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.032) 0:02:18.306 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.041) 0:02:18.347 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.037) 0:02:18.385 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654105342.2071216,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1654105342.2071216,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 25061,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1654105342.2071216,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.404) 0:02:18.789 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.042) 0:02:18.831 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.041) 0:02:18.872 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.035) 0:02:18.908 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.033) 0:02:18.941 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.040) 0:02:18.981 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.033) 0:02:19.015 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.035) 0:02:19.050 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.033) 0:02:19.083 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.041) 0:02:19.125 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.035) 0:02:19.160 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.035) 0:02:19.195 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.034) 0:02:19.230 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:42:31 +0000 (0:00:00.036) 0:02:19.266 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.034) 0:02:19.301 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.041) 0:02:19.343 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.039) 0:02:19.382 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.034) 0:02:19.416 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.033) 0:02:19.449 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.037) 0:02:19.486 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.037) 0:02:19.524 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.035) 0:02:19.559 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.036) 0:02:19.595 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.034) 0:02:19.630 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.033) 0:02:19.663 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.038) 0:02:19.702 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.034) 0:02:19.736 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.034) 0:02:19.771 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:42:32 +0000 (0:00:00.420) 0:02:20.191 ********
ok: [/cache/rhel-x.qcow2] => {
    "bytes": 5368709120,
    "changed": false,
    "lvm": "5g",
    "parted": "5GiB",
    "size": "5 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.396) 0:02:20.587 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_expected_size": "5368709120"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.041) 0:02:20.629 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.037) 0:02:20.667 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.032) 0:02:20.700 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.034) 0:02:20.734 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.032) 0:02:20.766 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.033) 0:02:20.800 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.032) 0:02:20.832 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_actual_size": {
        "bytes": 5368709120,
        "changed": false,
        "failed": false,
        "lvm": "5g",
        "parted": "5GiB",
        "size": "5 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.036) 0:02:20.869 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_test_expected_size": "5368709120"
}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.036) 0:02:20.905 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:42:33 +0000 (0:00:00.045) 0:02:20.951 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.030274",
    "end": "2022-06-01 13:42:33.392847",
    "rc": 0,
    "start": "2022-06-01 13:42:33.362573"
}

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.434) 0:02:21.385 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.040) 0:02:21.426 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.044) 0:02:21.471 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.035) 0:02:21.507 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.037) 0:02:21.544 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.037) 0:02:21.582 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.081) 0:02:21.663 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.033) 0:02:21.696 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.031) 0:02:21.728 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.030) 0:02:21.758 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Clean up] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:267
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.031) 0:02:21.790 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.081) 0:02:21.872 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:42:34 +0000 (0:00:00.058) 0:02:21.930 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.575) 0:02:22.506 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.085) 0:02:22.591 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.036) 0:02:22.628 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.035) 0:02:22.664 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.071) 0:02:22.736 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.030) 0:02:22.766 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.035) 0:02:22.801 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": [
        {
            "disks": [
                "vdb"
            ],
            "name": "foo",
            "state": 
"absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.043) 0:02:22.845 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.041) 0:02:22.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.037) 0:02:22.923 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.035) 0:02:22.958 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.034) 0:02:22.993 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.031) 0:02:23.024 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.048) 0:02:23.072 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:42:35 +0000 (0:00:00.032) 0:02:23.104 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext3", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:42:38 +0000 (0:00:02.543) 0:02:25.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.033) 0:02:25.682 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.030) 0:02:25.712 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext3" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, 
"crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext3", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.040) 0:02:25.753 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.043) 0:02:25.797 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.036) 0:02:25.834 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext3', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", 
"changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "fstype": "ext3", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:42:38 +0000 (0:00:00.435) 0:02:26.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:42:39 +0000 (0:00:00.690) 0:02:26.960 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:42:39 +0000 (0:00:00.035) 0:02:26.996 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:42:40 +0000 (0:00:00.693) 0:02:27.690 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", 
"mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:42:40 +0000 (0:00:00.424) 0:02:28.114 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:42:40 +0000 (0:00:00.030) 0:02:28.145 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:280 Wednesday 01 June 2022 17:42:41 +0000 (0:00:00.912) 0:02:29.058 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:42:41 +0000 (0:00:00.068) 0:02:29.126 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:42:41 +0000 (0:00:00.046) 0:02:29.173 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:42:41 +0000 (0:00:00.035) 0:02:29.208 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:42:42 +0000 (0:00:00.405) 0:02:29.614 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003143", "end": "2022-06-01 13:42:42.037218", "rc": 0, "start": "2022-06-01 13:42:42.034075" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] 
********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:42:42 +0000 (0:00:00.413) 0:02:30.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002885", "end": "2022-06-01 13:42:42.448681", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:42:42.445796" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.410) 0:02:30.438 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.066) 0:02:30.505 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.031) 0:02:30.536 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.064) 0:02:30.600 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.039) 0:02:30.640 ******** TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.027) 0:02:30.667 ******** TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.029) 0:02:30.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.038) 0:02:30.735 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.039) 0:02:30.774 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.037) 0:02:30.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.032) 0:02:30.845 
******** TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.029) 0:02:30.874 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.058) 0:02:30.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.031) 0:02:30.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.033) 0:02:30.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.030) 0:02:31.029 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.030) 0:02:31.059 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: 
/tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.033) 0:02:31.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.033) 0:02:31.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.081) 0:02:31.208 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:42:43 +0000 (0:00:00.035) 0:02:31.244 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.060) 0:02:31.304 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.063) 0:02:31.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: 
/tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.031) 0:02:31.399 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.033) 0:02:31.432 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.033) 0:02:31.466 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.068) 0:02:31.534 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.040) 0:02:31.574 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.030) 0:02:31.605 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.031) 0:02:31.637 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.034) 0:02:31.671 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.067) 0:02:31.739 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.068) 0:02:31.807 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.032) 0:02:31.840 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.036) 0:02:31.876 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.034) 0:02:31.910 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.031) 0:02:31.942 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.031) 0:02:31.973 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.034) 0:02:32.007 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.037) 0:02:32.045 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.046) 0:02:32.091 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.034) 0:02:32.126 ********
[WARNING]: The loop variable 'storage_test_volume' is already in
use. You should set the `loop_var` value in the `loop_control` option for the
task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.065) 0:02:32.192 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:42:44 +0000 (0:00:00.038) 0:02:32.231 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.129) 0:02:32.361 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.037) 0:02:32.398 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.050) 0:02:32.449 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.034) 0:02:32.483 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.043) 0:02:32.526 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.040) 0:02:32.567 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.035) 0:02:32.602 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.033) 0:02:32.636 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.032) 0:02:32.668 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.034) 0:02:32.703 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.050) 0:02:32.754 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.030) 0:02:32.785 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.040) 0:02:32.825 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.033) 0:02:32.859 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.034) 0:02:32.894 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.032) 0:02:32.926 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:42:45 +0000 (0:00:00.028) 0:02:32.954 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.387) 0:02:33.342 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.099) 0:02:33.441 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.028) 0:02:33.470 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.037) 0:02:33.507 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.028) 0:02:33.568 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022
17:42:46 +0000 (0:00:00.033) 0:02:33.601 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.036) 0:02:33.638 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.670 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.028) 0:02:33.698 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.030) 0:02:33.729 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.037) 0:02:33.832 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.865 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.047) 0:02:33.912 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.043) 0:02:33.955 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:33.988 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.034) 0:02:34.022 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.037) 0:02:34.060 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.034) 0:02:34.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.033) 0:02:34.127 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.032) 0:02:34.161 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.036) 0:02:34.197 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.033) 0:02:34.231 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:42:46 +0000 (0:00:00.036) 0:02:34.268 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.303 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.338 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.371 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.405 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.040) 0:02:34.442 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.040) 0:02:34.483 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.035) 0:02:34.518 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.035) 0:02:34.553 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.587 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.621 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.655 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.041) 0:02:34.696 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.730 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.031) 0:02:34.762 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.829 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.862 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022
17:42:47 +0000 (0:00:00.036) 0:02:34.899 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.034) 0:02:34.934 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:34.967 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:35.001 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.035) 0:02:35.037 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.033) 0:02:35.070 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.036) 0:02:35.107 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Create a LVM logical volume with "5g" for ext2 FS] ***********************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:284
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.037) 0:02:35.145 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:42:47 +0000 (0:00:00.110) 0:02:35.255 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.047) 0:02:35.302 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.551) 0:02:35.854 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.076) 0:02:35.930 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.032) 0:02:35.962 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.033) 0:02:35.996 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.117) 0:02:36.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.028) 0:02:36.142 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.033) 0:02:36.176 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext2", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.038) 0:02:36.216 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:42:48 +0000 (0:00:00.038) 0:02:36.255 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.033) 0:02:36.288 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.036) 0:02:36.325 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.033) 0:02:36.358 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.032) 0:02:36.391 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.047) 0:02:36.438 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:42:49 +0000 (0:00:00.030) 0:02:36.469 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:42:51 +0000 (0:00:02.342) 0:02:38.812 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:42:51 +0000 (0:00:00.035) 0:02:38.848 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:42:51 +0000 (0:00:00.033) 0:02:38.881 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdc", "/dev/vdd", "/dev/mapper/foo-test1" ], "mounts": [ { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "e2fsprogs", "dosfstools", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null,
"raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:42:51 +0000 (0:00:00.047) 0:02:38.929 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:42:51 
+0000 (0:00:00.044) 0:02:38.973 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:42:51 +0000 (0:00:00.039) 0:02:39.012 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:42:51 +0000 (0:00:00.030) 0:02:39.043 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:42:52 +0000 (0:00:00.687) 0:02:39.731 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext2", "mount_info": { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:42:52 +0000 (0:00:00.460) 0:02:40.191 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:42:53 +0000 (0:00:00.667) 0:02:40.858 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:42:53 +0000 (0:00:00.409) 0:02:41.267 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:42:54 +0000 (0:00:00.033) 0:02:41.301 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:298 Wednesday 01 June 2022 17:42:54 +0000 (0:00:00.892) 0:02:42.194 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 
Wednesday 01 June 2022 17:42:54 +0000 (0:00:00.076) 0:02:42.271 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:42:55 +0000 (0:00:00.043) 0:02:42.314 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:42:55 +0000 (0:00:00.032) 0:02:42.347 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext2", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "nyAJQq-9UAx-QPs2-ECbR-Ebh9-1xr7-fD7f52" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:42:55 +0000 (0:00:00.402) 0:02:42.750 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003287", "end": "2022-06-01 13:42:55.200342", "rc": 0, "start": "2022-06-01 13:42:55.197055" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext2 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:42:55 +0000 (0:00:00.444) 0:02:43.195 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003176", "end": "2022-06-01 13:42:55.605481", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:42:55.602305" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.401) 0:02:43.596 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.067) 0:02:43.664 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.034) 0:02:43.699 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.064) 0:02:43.764 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.045) 0:02:43.809 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.405) 0:02:44.215 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:42:56 +0000 (0:00:00.046) 0:02:44.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.041) 0:02:44.303 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.043) 0:02:44.346 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.040) 0:02:44.386 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.034) 0:02:44.420 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.049) 
0:02:44.470 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.065) 0:02:44.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.036) 0:02:44.571 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.031) 0:02:44.603 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.031) 0:02:44.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.031) 0:02:44.666 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.033) 0:02:44.700 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.031) 0:02:44.731 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.033) 0:02:44.765 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.032) 0:02:44.797 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.062) 0:02:44.859 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.065) 0:02:44.925 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.032) 0:02:44.957 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.031) 0:02:44.988 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.033) 0:02:45.021 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.061) 0:02:45.083 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.037) 0:02:45.121 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.039) 0:02:45.160 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.069) 0:02:45.229 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:42:57 +0000 (0:00:00.039) 0:02:45.268 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.037) 0:02:45.306 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.031) 0:02:45.337 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.029) 0:02:45.367 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.035) 0:02:45.403 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.033) 0:02:45.437 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.035) 0:02:45.472 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.067) 0:02:45.539 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.123) 0:02:45.663 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.034) 0:02:45.697 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.050) 0:02:45.747 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.048) 0:02:45.796 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.046) 0:02:45.842 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.036) 0:02:45.879 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.033) 0:02:45.912 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.049) 0:02:45.962 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.034) 0:02:45.997 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.034) 0:02:46.031 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.066) 0:02:46.098 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.039) 0:02:46.137 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:42:58 +0000 (0:00:00.137) 0:02:46.274 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/foo-test1"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.041) 0:02:46.316 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": [{"block_available": 1222050, "block_size": 4096, "block_total": 1287592, "block_used": 65542, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 5005516800, "size_total": 5273976832, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60"}], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [{"block_available": 1222050, "block_size": 4096, "block_total": 1287592, "block_used": 65542, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 5005516800, "size_total": 5273976832, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60"}], "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.048) 0:02:46.365 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.042) 0:02:46.407 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.039) 0:02:46.447 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.042) 0:02:46.489 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.031) 0:02:46.521 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.031) 0:02:46.553 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.036) 0:02:46.590 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.032) 0:02:46.622 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/foo-test1 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 ext2 defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.047) 0:02:46.670 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.037) 0:02:46.708 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.037) 0:02:46.745 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.029) 0:02:46.775 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.034) 0:02:46.810 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.042) 0:02:46.852 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:42:59 +0000 (0:00:00.039) 0:02:46.892 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "stat": {"atime": 1654105370.7541215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105370.7541215, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 25488, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105370.7541215, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.429) 0:02:47.321 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.039) 0:02:47.360 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.038) 0:02:47.399 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.037) 0:02:47.436 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.034) 0:02:47.471 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.037) 0:02:47.508 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.032) 0:02:47.540 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.033) 0:02:47.573 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:47.605 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.043) 0:02:47.648 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.030) 0:02:47.679 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.029) 0:02:47.709 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:47.741 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.030) 0:02:47.772 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:47.803 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.044) 0:02:47.847 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.041) 0:02:47.889 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:47.921 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.030) 0:02:47.952 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.036) 0:02:47.988 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.034) 0:02:48.022 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.038) 0:02:48.061 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.090) 0:02:48.152 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.032) 0:02:48.184 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:48.215 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:43:00 +0000 (0:00:00.031) 0:02:48.246 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.033) 0:02:48.280 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.041) 0:02:48.322 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB"}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.404) 0:02:48.727 ********
ok: [/cache/rhel-x.qcow2] => {"bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.397) 0:02:49.125 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_expected_size": "5368709120"}, "changed": false}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.039) 0:02:49.164 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "5368709120"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.035) 0:02:49.200 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.032) 0:02:49.232 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:43:01 +0000 (0:00:00.034) 0:02:49.266 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.035) 0:02:49.302 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.034) 0:02:49.337 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.036) 0:02:49.374 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_actual_size": {"bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB"}}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.035) 0:02:49.410 ********
ok: [/cache/rhel-x.qcow2] => {"storage_test_expected_size": "5368709120"}

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.033) 0:02:49.443 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.038) 0:02:49.481 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false, "cmd": ["lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1"], "delta": "0:00:00.048221", "end": "2022-06-01 13:43:01.937106", "rc": 0, "start": "2022-06-01 13:43:01.888885"}
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.457) 0:02:49.939 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_lv_segtype": ["linear"]}, "changed": false}

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.044) 0:02:49.983 ********
ok: [/cache/rhel-x.qcow2] => {"changed": false}
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.047) 0:02:50.031 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.038) 0:02:50.069 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.036) 0:02:50.106 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.038) 0:02:50.144 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.036) 0:02:50.181 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.036) 0:02:50.217 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_pool": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:43:02 +0000 (0:00:00.035) 0:02:50.253 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.031) 0:02:50.285 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null}, "changed": false}

TASK [Change volume size to "9g"] **********************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:300
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.033) 0:02:50.318 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.107) 0:02:50.425 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.048) 0:02:50.473 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.548) 0:02:51.022 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs"]}, "ansible_included_var_files": ["/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml"}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.072) 0:02:51.095 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.034) 0:02:51.130 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.032) 0:02:51.162 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.065) 0:02:51.227 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:43:03 +0000 (0:00:00.030) 0:02:51.258 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.034) 0:02:51.292 ********
ok: [/cache/rhel-x.qcow2] => {"storage_pools": [{"disks": ["vdb"], "name": "foo", "type": "lvm", "volumes": [{"fs_type": "ext2", "mount_point": "/opt/test1", "name": "test1", "size": "9g"}]}]}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.036) 0:02:51.332 ********
ok: [/cache/rhel-x.qcow2] => {"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.036) 0:02:51.369 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.101) 0:02:51.470 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.034) 0:02:51.505 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.034) 0:02:51.539 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.034) 0:02:51.574 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"storage_cryptsetup_services": []}, "changed": false}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.048) 0:02:51.622 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:43:04 +0000 (0:00:00.032) 0:02:51.654 ********
changed: [/cache/rhel-x.qcow2] => {"actions": [{"action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null}, {"action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2"}], "changed": true, "crypts": [], "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted"}], "packages": ["lvm2", "dosfstools", "xfsprogs", "mdadm", "e2fsprogs"], "pools": [{"disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [{"_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null}]}], "volumes": []}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:43:06 +0000 (0:00:02.561) 0:02:54.216 ********
skipping: [/cache/rhel-x.qcow2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:43:06 +0000 (0:00:00.034) 0:02:54.250 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.030) 0:02:54.281 ********
ok: [/cache/rhel-x.qcow2] => {"blivet_output": {"actions": [{"action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null}, {"action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2"}], "changed": true, "crypts": [], "failed": false, "leaves": ["/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd"], "mounts": [{"dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted"}], "packages": ["lvm2", "dosfstools", "xfsprogs", "mdadm", "e2fsprogs"], "pools": [{"disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [{"_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null}]}], "volumes": []}}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.042) 0:02:54.323 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_pools_list": [{"disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [{"_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": ["vdb"], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null}]}]}, "changed": false}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.042) 0:02:54.365 ********
ok: [/cache/rhel-x.qcow2] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.039) 0:02:54.405 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.034) 0:02:54.439 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:43:07 +0000 (0:00:00.749) 0:02:55.188 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext2", "mount_info": { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:43:08 +0000 (0:00:00.434) 0:02:55.623 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:43:09 +0000 (0:00:00.722) 0:02:56.346 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:43:09 +0000 (0:00:00.416) 0:02:56.762 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:43:09 +0000 (0:00:00.030) 0:02:56.792 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:314 Wednesday 01 June 2022 17:43:10 +0000 (0:00:00.905) 0:02:57.697 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:43:10 +0000 (0:00:00.078) 0:02:57.776 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "9g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:43:10 +0000 (0:00:00.043) 0:02:57.820 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:43:10 +0000 (0:00:00.031) 0:02:57.851 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext2", "label": "", "name": "/dev/mapper/foo-test1", "size": "9G", "type": "lvm", "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "nyAJQq-9UAx-QPs2-ECbR-Ebh9-1xr7-fD7f52" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:43:10 +0000 (0:00:00.395) 0:02:58.247 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002750", "end": "2022-06-01 13:43:10.649662", "rc": 0, "start": "2022-06-01 13:43:10.646912" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext2 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.391) 0:02:58.638 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002883", "end": "2022-06-01 13:43:11.043035", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:43:11.040152" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.415) 0:02:59.054 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.069) 0:02:59.124 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.034) 0:02:59.159 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.070) 0:02:59.229 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:43:11 +0000 (0:00:00.043) 0:02:59.273 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.403) 0:02:59.676 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.043) 0:02:59.720 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.039) 0:02:59.759 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.040) 0:02:59.800 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.045) 0:02:59.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.035) 0:02:59.882 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.045) 
0:02:59.928 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.074) 0:03:00.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.036) 0:03:00.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.035) 0:03:00.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.033) 0:03:00.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.034) 0:03:00.142 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.040) 0:03:00.183 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.042) 0:03:00.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:43:12 +0000 (0:00:00.035) 0:03:00.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.036) 0:03:00.297 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.065) 0:03:00.362 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.072) 0:03:00.434 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.034) 0:03:00.469 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.034) 0:03:00.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.033) 0:03:00.537 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.068) 0:03:00.606 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.039) 0:03:00.645 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.039) 0:03:00.684 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.060) 0:03:00.745 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.037) 0:03:00.783 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.041) 0:03:00.825 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.034) 0:03:00.859 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.032) 0:03:00.891 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.031) 0:03:00.922 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.036) 0:03:00.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.033) 0:03:00.992 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.066) 0:03:01.059 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.068) 0:03:01.127 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.032) 0:03:01.160 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.031) 0:03:01.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.032) 0:03:01.224 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:43:13 +0000 (0:00:00.031) 0:03:01.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.036) 0:03:01.292 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.086) 0:03:01.378 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.035) 0:03:01.414 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.034) 0:03:01.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.034) 0:03:01.482 ******** [WARNING]: The loop variable 'storage_test_volume' is already in 
use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.066) 0:03:01.549 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.039) 0:03:01.589 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.131) 0:03:01.720 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.039) 0:03:01.759 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2201429, "block_size": 4096, "block_total": 2322270, "block_used": 120841, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9017053184, "size_total": 9512017920, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2201429, "block_size": 4096, "block_total": 2322270, "block_used": 120841, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 589813, "inode_total": 589824, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9017053184, "size_total": 9512017920, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.046) 0:03:01.806 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.043) 0:03:01.849 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.039) 0:03:01.889 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.040) 0:03:01.929 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.031) 0:03:01.960 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.032) 0:03:01.993 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.035) 0:03:02.028 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.039) 0:03:02.068 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext2 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.049) 0:03:02.118 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.034) 0:03:02.152 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.037) 0:03:02.190 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.033) 0:03:02.223 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:43:14 +0000 (0:00:00.033) 0:03:02.257 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.043) 0:03:02.300 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.038) 0:03:02.338 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105386.1571214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105386.1571214, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 25488, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105386.1571214, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.424) 0:03:02.763 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.041) 0:03:02.804 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.040) 0:03:02.845 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.037) 0:03:02.882 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.040) 0:03:02.923 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.040) 0:03:02.964 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.033) 0:03:02.997 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.031) 0:03:03.029 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.039) 0:03:03.069 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.044) 0:03:03.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.037) 0:03:03.151 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.036) 0:03:03.187 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.034) 0:03:03.221 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:43:15 +0000 (0:00:00.032) 0:03:03.254 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.031) 0:03:03.285 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.042) 0:03:03.328 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.041) 0:03:03.369 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.031) 0:03:03.401 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.033) 0:03:03.434 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.032) 0:03:03.466 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.032) 0:03:03.499 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.031) 0:03:03.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.034) 0:03:03.565 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.033) 0:03:03.599 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.035) 0:03:03.635 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.034) 0:03:03.669 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.034) 0:03:03.703 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.034) 0:03:03.737 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:43:16 +0000 (0:00:00.435) 0:03:04.172 ********
ok: [/cache/rhel-x.qcow2] => { "bytes": 9663676416, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.402) 0:03:04.575 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "9663676416" }, "changed": false }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.041) 0:03:04.616 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.035) 0:03:04.652 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.033) 0:03:04.686 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.033) 0:03:04.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.037) 0:03:04.757 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.037) 0:03:04.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.034) 0:03:04.829 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 9663676416, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.043) 0:03:04.873 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "9663676416" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.037) 0:03:04.911 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:43:17 +0000 (0:00:00.042) 0:03:04.953 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.039779", "end": "2022-06-01 13:43:17.402028", "rc": 0, "start": "2022-06-01 13:43:17.362249" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.439) 0:03:05.393 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.039) 0:03:05.433 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.040) 0:03:05.473 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.033) 0:03:05.506 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.032) 0:03:05.539 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.041) 0:03:05.580 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.033) 0:03:05.614 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.032) 0:03:05.647 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.034) 0:03:05.681 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.033) 0:03:05.715 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change volume size to "5g"] **********************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:316
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.033) 0:03:05.748 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.107) 0:03:05.855 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:43:18 +0000 (0:00:00.046) 0:03:05.902 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.553) 0:03:06.455 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.079) 0:03:06.534 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.032) 0:03:06.567 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.030) 0:03:06.598 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.064) 0:03:06.662 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.027) 0:03:06.690 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.033) 0:03:06.723 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "type": "lvm", "volumes": [ { "fs_type": "ext2", "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.038) 0:03:06.762 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.033) 0:03:06.796 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.030) 0:03:06.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.095) 0:03:06.922 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.032) 0:03:06.954 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.033) 0:03:06.987 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.047) 0:03:07.035 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:43:19 +0000 (0:00:00.031) 0:03:07.067 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:43:22 +0000 (0:00:02.733) 0:03:09.800 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.039) 0:03:09.840 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.034) 0:03:09.875 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" }, { "action": "resize device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/mapper/foo-test1", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.047) 0:03:09.923 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.041) 0:03:09.965 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.037) 0:03:10.003 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:43:22 +0000 (0:00:00.030) 0:03:10.033 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:43:23 +0000 (0:00:00.723) 0:03:10.757 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext2'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext2", "mount_info": { "dump": 0, "fstype": "ext2", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:43:23 +0000 (0:00:00.428) 0:03:11.186 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:43:24 +0000 (0:00:00.699) 0:03:11.885 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false,
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:43:25 +0000 (0:00:00.401) 0:03:12.287 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:43:25 +0000 (0:00:00.030) 0:03:12.318 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:330 Wednesday 01 June 2022 17:43:25 +0000 (0:00:00.931) 0:03:13.250 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:43:26 +0000 (0:00:00.091) 0:03:13.341 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:43:26 +0000 (0:00:00.043) 0:03:13.384 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:43:26 +0000 (0:00:00.030) 0:03:13.415 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "ext2", "label": "", "name": "/dev/mapper/foo-test1", "size": "5G", "type": "lvm", "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" }, "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": 
"75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "LVM2_member", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "nyAJQq-9UAx-QPs2-ECbR-Ebh9-1xr7-fD7f52" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:43:26 +0000 (0:00:00.418) 0:03:13.834 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002953", "end": "2022-06-01 13:43:26.247205", "rc": 0, "start": "2022-06-01 13:43:26.244252" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 /dev/mapper/foo-test1 /opt/test1 ext2 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:43:26 +0000 (0:00:00.408) 0:03:14.243 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003441", "end": "2022-06-01 13:43:26.666886", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:43:26.663445" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:43:27 +0000 (0:00:00.416) 0:03:14.659 ******** [WARNING]: The loop variable 'storage_test_pool' is already in use. 
You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5 Wednesday 01 June 2022 17:43:27 +0000 (0:00:00.124) 0:03:14.784 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18 Wednesday 01 June 2022 17:43:27 +0000 (0:00:00.033) 0:03:14.817 ******** included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1 Wednesday 01 June 2022 17:43:27 +0000 (0:00:00.071) 0:03:14.889 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [ "/dev/vdb" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11 Wednesday 01 June 2022 17:43:27 +0000 (0:00:00.048) 0:03:14.938 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/vdb", "pv": "/dev/vdb" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.418) 0:03:15.357 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/vdb" ] }, "ansible_index_var": "idx", "ansible_loop_var": "item", "changed": false, "idx": 0, "item": "/dev/vdb" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.045) 0:03:15.402 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.047) 0:03:15.450 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.043) 0:03:15.493 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.042) 0:03:15.535 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.035) 0:03:15.571 ******** ok: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/vdb" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.044) 
0:03:15.616 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.058) 0:03:15.674 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.032) 0:03:15.707 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.036) 0:03:15.744 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.032) 0:03:15.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.035) 0:03:15.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.032) 0:03:15.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata 
version] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.034) 0:03:15.878 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.036) 0:03:15.915 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.038) 0:03:15.953 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.060) 0:03:16.014 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2 TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.063) 0:03:16.077 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.030) 0:03:16.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.029) 0:03:16.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.030) 0:03:16.169 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.063) 0:03:16.233 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8 Wednesday 01 June 2022 17:43:28 +0000 (0:00:00.043) 0:03:16.276 ******** skipping: [/cache/rhel-x.qcow2] => (item=/dev/vdb) => { "_storage_test_pool_member_path": "/dev/vdb", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.040) 0:03:16.316 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:1 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.059) 0:03:16.376 ******** 
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:6 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.037) 0:03:16.413 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:11 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.042) 0:03:16.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:17 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.035) 0:03:16.491 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:23 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.032) 0:03:16.524 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-crypttab.yml:29 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.034) 0:03:16.559 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.033) 0:03:16.592 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.033) 0:03:16.626 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.069) 0:03:16.695 ******** included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2 TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.070) 0:03:16.766 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.035) 0:03:16.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.032) 0:03:16.834 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.033) 0:03:16.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.033) 0:03:16.901 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.032) 0:03:16.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.032) 0:03:16.967 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.035) 0:03:17.003 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.106) 0:03:17.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.041) 0:03:17.151 ******** [WARNING]: The loop variable 'storage_test_volume' is already in 
use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.063) 0:03:17.214 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:43:29 +0000 (0:00:00.039) 0:03:17.254 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.131) 0:03:17.386 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.042) 0:03:17.428 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1222051, "block_size": 4096, "block_total": 1290144, "block_used": 68093, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 5005520896, "size_total": 5284429824, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1222051, "block_size": 4096, "block_total": 1290144, "block_used": 68093, "device": "/dev/mapper/foo-test1", "fstype": "ext2", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 5005520896, "size_total": 5284429824, "uuid": "03310f12-8356-4d20-aca4-e86d75e29e60" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.048) 0:03:17.476 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.042) 0:03:17.519 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.040) 0:03:17.560 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: 
All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.045) 0:03:17.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.032) 0:03:17.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.034) 0:03:17.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.031) 0:03:17.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.032) 0:03:17.736 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ 
"/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext2 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.048) 0:03:17.785 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.043) 0:03:17.829 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.044) 0:03:17.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.035) 0:03:17.909 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.038) 0:03:17.948 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.042) 0:03:17.990 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:43:30 +0000 (0:00:00.042) 0:03:18.033 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105401.7111216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105401.7111216, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 25488, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1654105401.7111216, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.415) 0:03:18.448 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.038) 0:03:18.487 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** 
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.036) 0:03:18.524 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.036) 0:03:18.561 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.033) 0:03:18.594 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.039) 0:03:18.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.036) 0:03:18.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.034) 0:03:18.704 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 
Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.035) 0:03:18.739 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.043) 0:03:18.782 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.035) 0:03:18.818 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.037) 0:03:18.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.039) 0:03:18.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.034) 0:03:18.930 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.034) 
0:03:18.965 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.042) 0:03:19.008 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.040) 0:03:19.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.033) 0:03:19.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.036) 0:03:19.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.033) 0:03:19.152 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] 
********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.033) 0:03:19.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.032) 0:03:19.217 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:43:31 +0000 (0:00:00.033) 0:03:19.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.034) 0:03:19.286 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.037) 0:03:19.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.034) 0:03:19.358 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.035) 0:03:19.394 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.034) 0:03:19.429 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:43:32 +0000 (0:00:00.429) 0:03:19.858 ******** ok: [/cache/rhel-x.qcow2] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.431) 0:03:20.290 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_expected_size": "5368709120" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.040) 0:03:20.331 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.036) 0:03:20.367 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.031) 0:03:20.398 
******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.033) 0:03:20.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.031) 0:03:20.463 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.031) 0:03:20.494 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.033) 0:03:20.528 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.038) 0:03:20.567 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.037) 0:03:20.604 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.048) 0:03:20.653 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.043929", "end": "2022-06-01 13:43:33.115213", "rc": 0, "start": "2022-06-01 13:43:33.071284" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.456) 0:03:21.109 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.041) 0:03:21.151 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.043) 0:03:21.195 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.037) 0:03:21.232 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:43:33 +0000 (0:00:00.037) 0:03:21.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.040) 0:03:21.310 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.038) 0:03:21.349 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.033) 0:03:21.383 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.030) 0:03:21.414 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.037) 0:03:21.451 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:332 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.039) 
0:03:21.491 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.125) 0:03:21.616 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.049) 0:03:21.665 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:43:34 +0000 (0:00:00.573) 0:03:22.239 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.076) 0:03:22.316 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.036) 0:03:22.352 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.033) 0:03:22.386 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.065) 0:03:22.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.029) 0:03:22.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.033) 0:03:22.514 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": [ { "disks": [ "vdb" ], "name": "foo", "state": 
"absent", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "5g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.044) 0:03:22.559 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.038) 0:03:22.597 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.034) 0:03:22.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.034) 0:03:22.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.103) 0:03:22.771 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.035) 0:03:22.807 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.051) 0:03:22.858 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:43:35 +0000 (0:00:00.033) 0:03:22.891 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext2", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:43:38 +0000 (0:00:02.484) 0:03:25.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:43:38 +0000 (0:00:00.032) 0:03:25.409 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:43:38 +0000 (0:00:00.031) 0:03:25.440 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "ext2" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "lvmpv" } ], "changed": true, 
"crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext2", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs" ], "pools": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022  17:43:38 +0000 (0:00:00.041)       0:03:25.482 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022  17:43:38 +0000 (0:00:00.043)       0:03:25.525 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022  17:43:38 +0000 (0:00:00.036)       0:03:25.562 ********
changed: [/cache/rhel-x.qcow2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'ext2', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext2", "mount_info": { "fstype": "ext2", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022  17:43:38 +0000 (0:00:00.421)       0:03:25.983 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022  17:43:39 +0000 (0:00:00.694)       0:03:26.678 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022  17:43:39 +0000 (0:00:00.034)       0:03:26.713 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022  17:43:40 +0000 (0:00:00.688)       0:03:27.401 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022  17:43:40 +0000 (0:00:00.397)       0:03:27.798 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022  17:43:40 +0000 (0:00:00.032)       0:03:27.831 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:345
Wednesday 01 June 2022  17:43:41 +0000 (0:00:00.844)       0:03:28.675 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022  17:43:41 +0000 (0:00:00.088)       0:03:28.763 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_pools_list": [ { "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext2", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "5g", "state": "present", "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022  17:43:41 +0000 (0:00:00.044)       0:03:28.808 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022  17:43:41 +0000 (0:00:00.032)       0:03:28.841 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022  17:43:41 +0000 (0:00:00.399)       0:03:29.241 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002870", "end": "2022-06-01 13:43:41.635515", "rc": 0, "start": "2022-06-01 13:43:41.632645" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022  17:43:42 +0000 (0:00:00.386)       0:03:29.627 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002849", "end": "2022-06-01 13:43:42.040507", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:43:42.037658" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022  17:43:42 +0000 (0:00:00.404)       0:03:30.032 ********
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-pool.yml for /cache/rhel-x.qcow2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:5
Wednesday 01 June 2022  17:43:42 +0000 (0:00:00.065)       0:03:30.098 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool.yml:18
Wednesday 01 June 2022  17:43:42 +0000 (0:00:00.101)       0:03:30.199 ********
included: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:1
Wednesday 01 June 2022  17:43:42 +0000 (0:00:00.069)       0:03:30.268 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:11
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.049)       0:03:30.318 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:20
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.035)       0:03:30.354 ********

TASK [Verify PV count] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:27
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.031)       0:03:30.386 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:34
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.040)       0:03:30.426 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:38
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.037)       0:03:30.464 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:42
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.036)       0:03:30.501 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:46
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:30.535 ********

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:56
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.032)       0:03:30.567 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-md.yml for /cache/rhel-x.qcow2

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:6
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.057)       0:03:30.624 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:12
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:30.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:16
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.036)       0:03:30.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:20
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:30.728 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:24
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:30.761 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:30
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:30.795 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:36
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.032)       0:03:30.827 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-md.yml:44
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.031)       0:03:30.859 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:59
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.032)       0:03:30.891 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-lvmraid.yml:1
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.064)       0:03:30.956 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml for /cache/rhel-x.qcow2

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:3
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.073)       0:03:31.030 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:8
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.038)       0:03:31.069 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-lvmraid.yml:12
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.041)       0:03:31.110 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check member encryption] *************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:62
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.033)       0:03:31.144 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:4
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.064)       0:03:31.208 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:8
Wednesday 01 June 2022  17:43:43 +0000 (0:00:00.039)       0:03:31.247 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:15
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.034)       0:03:31.282 ********

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-encryption.yml:22
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.029)       0:03:31.312 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:65
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.034)       0:03:31.346 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml for /cache/rhel-x.qcow2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-members-vdo.yml:1
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.064)       0:03:31.411 ********
included: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml for /cache/rhel-x.qcow2

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:3
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.068)       0:03:31.479 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:8
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.039)       0:03:31.518 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:11
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.035)       0:03:31.554 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:16
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.035)       0:03:31.589 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:21
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.036)       0:03:31.625 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:24
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.032)       0:03:31.658 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:29
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.031)       0:03:31.690 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/verify-pool-member-vdo.yml:39
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.030)       0:03:31.720 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-members.yml:68
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.031)       0:03:31.752 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [verify the volumes] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-pool-volumes.yml:3
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.032)       0:03:31.784 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.061)       0:03:31.846 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.037)       0:03:31.883 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.124)       0:03:32.008 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.039)       0:03:32.048 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.046)       0:03:32.094 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.100)       0:03:32.194 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.040)       0:03:32.235 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022  17:43:44 +0000 (0:00:00.034)       0:03:32.269 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.041)       0:03:32.311 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.033)       0:03:32.345 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.034)       0:03:32.379 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.038)       0:03:32.418 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.051)       0:03:32.469 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.028)       0:03:32.497 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.042)       0:03:32.540 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.035)       0:03:32.576 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.036)       0:03:32.612 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.031)       0:03:32.644 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.028)       0:03:32.672 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "exists": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.399)       0:03:33.071 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.040)       0:03:33.112 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.032)       0:03:33.145 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.041)       0:03:33.186 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.040)       0:03:33.227 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022  17:43:45 +0000 (0:00:00.029)       0:03:33.257 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.038)       0:03:33.295 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.035)       0:03:33.331 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.033)       0:03:33.364 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.029)       0:03:33.394 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.035)       0:03:33.430 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.031)       0:03:33.461 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.034)       0:03:33.495 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.036)       0:03:33.531 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.033)       0:03:33.565 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.040)       0:03:33.606 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.040)       0:03:33.647 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.031)       0:03:33.678 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.035)       0:03:33.714 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.037)       0:03:33.752 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.034)       0:03:33.786 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.032)       0:03:33.819 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.037)       0:03:33.856 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022  17:43:46 +0000 (0:00:00.035)       0:03:33.891 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path:
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.036) 0:03:33.928 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.033) 0:03:33.962 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.034) 0:03:33.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.036) 0:03:34.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.040) 0:03:34.073 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.036) 0:03:34.109 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 
2022 17:43:46 +0000 (0:00:00.034) 0:03:34.143 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.037) 0:03:34.181 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.033) 0:03:34.214 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:43:46 +0000 (0:00:00.032) 0:03:34.246 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.034) 0:03:34.280 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.033) 0:03:34.313 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.035) 0:03:34.349 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task 
path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.039) 0:03:34.389 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "5368709120" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.042) 0:03:34.432 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.034) 0:03:34.466 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.037) 0:03:34.504 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.035) 0:03:34.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.034) 0:03:34.574 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 
17:43:47 +0000 (0:00:00.031) 0:03:34.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.031) 0:03:34.638 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.031) 0:03:34.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.034) 0:03:34.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.031) 0:03:34.735 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.033) 0:03:34.769 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.032) 0:03:34.801 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": 
null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1203 changed=26 unreachable=0 failed=3 skipped=956 rescued=3 ignored=0 Wednesday 01 June 2022 17:43:47 +0000 (0:00:00.018) 0:03:34.819 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.75s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.73s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.56s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.54s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.48s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.41s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.38s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.34s 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.32s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.30s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.28s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.27s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.84s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.77s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.34s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : Update facts ------------------------------- 1.23s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure blivet is available -------------- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 ansible-playbook 2.9.27 
config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:43:48 +0000 (0:00:00.022) 0:00:00.022 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:43:49 +0000 (0:00:01.359) 0:00:01.382 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_resize_nvme_generated.yml **************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_resize_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:43:49 +0000 (0:00:00.041) 0:00:01.423 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.36s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:43:50 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:43:51 +0000 (0:00:01.369) 0:00:01.393 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_resize_scsi_generated.yml **************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_resize_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize_scsi_generated.yml:3
Wednesday 01 June 2022 17:43:51 +0000 (0:00:00.038) 0:00:01.431 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize_scsi_generated.yml:7
Wednesday 01 June 2022 17:43:53 +0000 (0:00:01.306) 0:00:02.738 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:1
Wednesday 01 June 2022 17:43:53 +0000 (0:00:00.029) 0:00:02.767 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:20
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.817) 0:00:03.585 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.052) 0:00:03.638 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.157) 0:00:03.795 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.552) 0:00:04.348 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.078) 0:00:04.426 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.023) 0:00:04.450 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:43:54 +0000 (0:00:00.023) 0:00:04.473 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:43:55 +0000 (0:00:00.197) 0:00:04.671 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:43:55 +0000 (0:00:00.021) 0:00:04.693 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:43:56 +0000 (0:00:01.197) 0:00:05.890 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:43:56 +0000 (0:00:00.059) 0:00:05.950 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:43:56 +0000 (0:00:00.054) 0:00:06.004 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:43:57 +0000 (0:00:00.705) 0:00:06.710 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:43:57 +0000 (0:00:00.123) 0:00:06.834 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:43:57 +0000 (0:00:00.023) 0:00:06.857 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:43:57 +0000 (0:00:00.024) 0:00:06.882 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:43:57 +0000 (0:00:00.022) 0:00:06.905 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:43:58 +0000 (0:00:00.913) 0:00:07.818 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state":
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:44:00 +0000 (0:00:01.839) 0:00:09.658 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.045) 0:00:09.703 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.029) 0:00:09.732 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.537) 0:00:10.270 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.029) 0:00:10.299 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.026) 0:00:10.325 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.032) 0:00:10.358 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.035) 0:00:10.394 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.034) 0:00:10.428 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.028) 0:00:10.456 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:44:00 +0000 (0:00:00.031) 0:00:10.487 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:44:01 +0000 (0:00:00.032) 0:00:10.520 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:44:01 +0000 (0:00:00.031) 0:00:10.552 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:44:01 +0000 (0:00:00.526) 0:00:11.078 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:44:01 +0000 (0:00:00.030) 0:00:11.108 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:23 Wednesday 01 June 2022 17:44:02 +0000 (0:00:00.880) 0:00:11.989 ******** ok: 
[/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_resize.yml:30
Wednesday 01 June 2022 17:44:02 +0000 (0:00:00.031) 0:00:12.020 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:44:02 +0000 (0:00:00.051) 0:00:12.071 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "disks": "Unable to find unused disk"
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:44:03 +0000 (0:00:00.600) 0:00:12.672 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:44:03 +0000 (0:00:00.032) 0:00:12.704 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "changed": false
}

MSG:

Unable to find enough unused disks. Exiting playbook.
PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0

Wednesday 01 June 2022 17:44:03 +0000 (0:00:00.021) 0:00:12.726 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/tmp7247_7fr/tests/tests_resize_scsi_generated.yml:3 ----------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.20s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
linux-system-roles.storage : make sure required packages are installed --- 0.91s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : Update facts ------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp7247_7fr/tests/tests_resize.yml:1 -------------------------------------
linux-system-roles.storage : get required packages ---------------------- 0.71s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Find unused disks in the system ----------------------------------------- 0.60s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.54s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.16s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : enable copr repositories if needed --------- 0.12s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : show storage_pools ------------------------- 0.06s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
linux-system-roles.storage : show storage_volumes ----------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
include_role : linux-system-roles.storage ------------------------------- 0.05s
/tmp/tmp7247_7fr/tests/tests_resize.yml:20 ------------------------------------
include_tasks ----------------------------------------------------------- 0.05s
/tmp/tmp7247_7fr/tests/tests_resize.yml:30 ------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:44:03 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the
fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:44:05 +0000 (0:00:01.369) 0:00:01.393 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.37s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_safe_mode_check.yml ******************************************** 1 plays in /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:2 Wednesday 01 June 2022 17:44:05 +0000 (0:00:00.021) 0:00:01.415 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:15 Wednesday 01 June 2022 17:44:06 +0000 (0:00:01.112) 0:00:02.527 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:44:06 +0000 (0:00:00.039) 0:00:02.566 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:44:06 +0000 (0:00:00.159) 0:00:02.726 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task 
path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.557) 0:00:03.284 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.077) 0:00:03.362 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.024) 0:00:03.386 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.023) 0:00:03.410 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.205) 0:00:03.615 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:44:07 +0000 (0:00:00.019) 0:00:03.635 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:44:08 +0000 (0:00:01.113) 0:00:04.748 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:44:08 +0000 (0:00:00.048) 0:00:04.796 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:44:08 +0000 (0:00:00.046) 0:00:04.843 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK
[linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:44:09 +0000 (0:00:00.702) 0:00:05.545 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:44:09 +0000 (0:00:00.081) 0:00:05.627 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:44:09 +0000 (0:00:00.023) 0:00:05.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:44:09 +0000 (0:00:00.024) 0:00:05.675 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:44:09 +0000 (0:00:00.022) 0:00:05.697 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:44:10 +0000 (0:00:00.843) 0:00:06.541 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
        "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
        "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
        "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
        "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
        "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
        "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:44:12 +0000 (0:00:01.885) 0:00:08.427 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:44:12 +0000 (0:00:00.042) 0:00:08.470 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:44:12 +0000 (0:00:00.027) 0:00:08.498 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.594) 0:00:09.092 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.031) 0:00:09.123 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.028) 0:00:09.151 ********
ok: [/cache/rhel-x.qcow2] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.031) 0:00:09.183 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.031) 0:00:09.215 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.033) 0:00:09.248 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.027) 0:00:09.275 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.029) 0:00:09.305 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.027) 0:00:09.333 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.028) 0:00:09.362 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "stat": {
        "atime": 1654103529.7011216,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1654103527.5841215,
        "dev": 64516,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 25792400,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1654103527.5821216,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "4245553602",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.490) 0:00:09.853 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:44:13 +0000 (0:00:00.029) 0:00:09.882 ********
ok: [/cache/rhel-x.qcow2]

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:18
Wednesday 01 June 2022 17:44:14 +0000 (0:00:00.866) 0:00:10.749 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:25
Wednesday 01 June 2022 17:44:14 +0000 (0:00:00.031) 0:00:10.781 ********
included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2
Wednesday 01 June 2022 17:44:14 +0000 (0:00:00.046) 0:00:10.827 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "disks": [
        "vdb"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9
Wednesday 01 June 2022 17:44:15 +0000 (0:00:00.544) 0:00:11.372 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "unused_disks": [
            "vdb"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14
Wednesday 01 June 2022 17:44:15 +0000 (0:00:00.037) 0:00:11.409 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19
Wednesday 01 June 2022 17:44:15 +0000 (0:00:00.031) 0:00:11.441 ********
ok: [/cache/rhel-x.qcow2] => {
    "unused_disks": [
        "vdb"
    ]
}

TASK [Install package] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:32
Wednesday 01 June 2022 17:44:15 +0000 (0:00:00.034) 0:00:11.475 ********
fatal: [/cache/rhel-x.qcow2]: FAILED! => {
    "changed": false,
    "failures": [
        "No package nilfs-utils available."
    ],
    "rc": 1,
    "results": []
}

MSG:

Failed to install some of the specified packages

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:38
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.930) 0:00:12.406 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_skip_rest": true
    },
    "changed": false
}

TASK [Create nilfs2 partition (1/2)] *******************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:42
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.031) 0:00:12.437 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Create nilfs2 partition (1/2)] *******************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:46
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.030) 0:00:12.468 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [create nilfs2 partition (2/2)] *******************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:50
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.029) 0:00:12.497 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Create two LVM logical volumes under volume group 'foo'] *****************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:56
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.031) 0:00:12.528 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [unreachable task] ********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:69
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.031) 0:00:12.560 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Cleanup - remove nilfs2 partition] ***************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:79
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.033) 0:00:12.593 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:87
Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.062) 0:00:12.655 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_skip_rest": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=29 changed=0 unreachable=0 failed=0 skipped=19 rescued=1 ignored=0

Wednesday 01 June 2022 17:44:16 +0000 (0:00:00.015) 0:00:12.671 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.89s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.37s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
linux-system-roles.storage : make sure blivet is available -------------- 1.11s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:2 ----------------------------
Install package --------------------------------------------------------- 0.93s
/tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:32 ---------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.87s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.70s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.59s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.56s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
Find unused disks in the system ----------------------------------------- 0.54s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.49s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.21s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.16s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
Cleanup - remove nilfs2 partition --------------------------------------- 0.06s
/tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:79 ---------------------------
linux-system-roles.storage : show storage_pools ------------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
include_tasks ----------------------------------------------------------- 0.05s
/tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:25 ---------------------------
linux-system-roles.storage : show storage_volumes ----------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:44:17 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:44:18 +0000 (0:00:01.321) 0:00:01.344 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_safe_mode_check_nvme_generated.yml *****************************
2 plays in /tmp/tmp7247_7fr/tests/tests_safe_mode_check_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:44:18 +0000 (0:00:00.018) 0:00:01.362 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:44:19 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:44:20 +0000 (0:00:01.374) 0:00:01.398 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.38s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_safe_mode_check_scsi_generated.yml *****************************
2 plays in /tmp/tmp7247_7fr/tests/tests_safe_mode_check_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check_scsi_generated.yml:3
Wednesday 01 June 2022 17:44:20 +0000 (0:00:00.015) 0:00:01.414 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check_scsi_generated.yml:7
Wednesday 01 June 2022 17:44:22 +0000 (0:00:01.131) 0:00:02.545 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:2
Wednesday 01 June 2022 17:44:22 +0000 (0:00:00.028) 0:00:02.573 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:15
Wednesday 01 June 2022 17:44:22 +0000 (0:00:00.884) 0:00:03.458 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:44:22 +0000 (0:00:00.039) 0:00:03.498 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:44:23 +0000 (0:00:00.163) 0:00:03.661 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:44:23 +0000 (0:00:00.548) 0:00:04.209 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:44:23 +0000 (0:00:00.079) 0:00:04.289 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:44:23 +0000 (0:00:00.024) 0:00:04.314 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:44:23 +0000 (0:00:00.023) 0:00:04.337 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:44:24 +0000 (0:00:00.202) 0:00:04.539 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:44:24 +0000 (0:00:00.021) 0:00:04.560 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:44:25 +0000 (0:00:01.069) 0:00:05.630 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:44:25 +0000 (0:00:00.048) 0:00:05.678 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:44:25 +0000 (0:00:00.048) 0:00:05.727 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:44:25 +0000 (0:00:00.717) 0:00:06.445 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:44:25 +0000 (0:00:00.084) 0:00:06.529 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:44:26 +0000 (0:00:00.023) 0:00:06.553 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:44:26 +0000 (0:00:00.024) 0:00:06.578 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:44:26 +0000 (0:00:00.022) 0:00:06.600 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:44:26 +0000 (0:00:00.848) 0:00:07.449 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
        "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
        "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
        "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
        "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
        "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
        "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
        "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
        "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
        "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
        "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
        "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
        "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
        "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
        "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
        "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
        "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
        "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
        "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
        "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
        "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
        "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
        "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
        "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
        "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
        "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
        "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
        "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
        "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
        "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
        "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" },
        "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
        "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
        "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
        "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
        "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
        "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
        "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
        "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
        "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
        "grub-boot-indeterminate.service":
{ "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", 
"status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { 
"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": 
"user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:44:28 +0000 (0:00:01.868) 0:00:09.317 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:44:28 +0000 (0:00:00.042) 0:00:09.360 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:44:28 +0000 (0:00:00.026) 0:00:09.386 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.549) 0:00:09.936 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.030) 0:00:09.966 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 
Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.027) 0:00:09.994 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.037) 0:00:10.031 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.034) 0:00:10.065 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.034) 0:00:10.100 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.028) 0:00:10.129 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.029) 0:00:10.158 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.028) 0:00:10.187 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:44:29 +0000 (0:00:00.033) 0:00:10.221 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:44:30 +0000 (0:00:00.533) 0:00:10.754 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:44:30 +0000 (0:00:00.029) 0:00:10.783 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:18 Wednesday 01 June 
2022 17:44:31 +0000 (0:00:00.880) 0:00:11.664 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:25 Wednesday 01 June 2022 17:44:31 +0000 (0:00:00.065) 0:00:11.730 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:44:31 +0000 (0:00:00.046) 0:00:11.776 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": "Unable to find unused disk" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:44:31 +0000 (0:00:00.517) 0:00:12.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:44:31 +0000 (0:00:00.029) 0:00:12.323 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: Unable to find enough unused disks. Exiting playbook. 
PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0 Wednesday 01 June 2022 17:44:31 +0000 (0:00:00.020) 0:00:12.343 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.38s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.13s /tmp/tmp7247_7fr/tests/tests_safe_mode_check_scsi_generated.yml:3 ------------- linux-system-roles.storage : make sure blivet is available -------------- 1.07s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 0.88s /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:2 ---------------------------- linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.85s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : get required packages ---------------------- 0.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : retrieve facts for 
the /etc/crypttab file --- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Find unused disks in the system ----------------------------------------- 0.52s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- Mark tasks to be skipped ------------------------------------------------ 0.07s /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:18 --------------------------- linux-system-roles.storage : show storage_volumes ----------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 linux-system-roles.storage : show storage_pools ------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 include_tasks ----------------------------------------------------------- 0.05s /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml:25 --------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file 
Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:44:32 +0000 (0:00:00.026) 0:00:00.026 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden 
due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:44:33 +0000 (0:00:01.406) 0:00:01.433 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.41s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_swap.yml *******************************************************
1 plays in /tmp/tmp7247_7fr/tests/tests_swap.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:2
Wednesday 01 June 2022 17:44:34 +0000 (0:00:00.018) 0:00:01.451 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:10
Wednesday 01 June 2022 17:44:35 +0000 (0:00:01.131) 0:00:02.583 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:44:35 +0000 (0:00:00.042) 0:00:02.626 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:44:35 +0000 (0:00:00.167) 0:00:02.793 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:44:35 +0000 (0:00:00.590) 0:00:03.384 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:44:36 +0000 (0:00:00.078) 0:00:03.463 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:44:36 +0000 (0:00:00.025) 0:00:03.488 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June
2022 17:44:36 +0000 (0:00:00.026) 0:00:03.514 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:44:36 +0000 (0:00:00.198) 0:00:03.712 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:44:36 +0000 (0:00:00.020) 0:00:03.733 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:44:37 +0000 (0:00:01.105) 0:00:04.839 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:44:37 +0000 (0:00:00.050) 0:00:04.889 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:44:37 +0000 (0:00:00.049) 0:00:04.938 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK
[linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:44:38 +0000 (0:00:00.722) 0:00:05.660 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:44:38 +0000 (0:00:00.086) 0:00:05.747 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:44:38 +0000 (0:00:00.021) 0:00:05.769 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:44:38 +0000 (0:00:00.025) 0:00:05.794 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:44:38 +0000 (0:00:00.023) 0:00:05.818 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:44:39 +0000 (0:00:00.876) 0:00:06.695 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status":
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:44:41 +0000 (0:00:01.864) 0:00:08.559 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.045) 0:00:08.605 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.028) 0:00:08.633 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.550) 0:00:09.184 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.029) 0:00:09.216
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.029) 0:00:09.245 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.037) 0:00:09.283 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.031) 0:00:09.314 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.031) 0:00:09.346 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.031) 0:00:09.377 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.031) 0:00:09.409 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:44:41 +0000 (0:00:00.029) 0:00:09.438 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:44:42 +0000 (0:00:00.028) 0:00:09.466 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:44:42 +0000 (0:00:00.493) 0:00:09.959 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:44:42 +0000 (0:00:00.031) 0:00:09.991 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:13 Wednesday 01 June 2022 17:44:43 +0000 (0:00:00.875) 0:00:10.866 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:20 Wednesday 01 June 2022 17:44:43 +0000 (0:00:00.030) 0:00:10.897 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:44:43 +0000 (0:00:00.046) 0:00:10.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "vdb", "vdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.559) 0:00:11.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "vdb", "vdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.035) 0:00:11.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.029) 0:00:11.568 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "vdb", "vdc" ] } TASK [Create a disk device with swap] ****************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:25 
Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.031) 0:00:11.599 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.060) 0:00:11.660 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.044) 0:00:11.705 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.608) 0:00:12.313 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.070) 0:00:12.383 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.030) 0:00:12.413 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:44:44 +0000 (0:00:00.029) 0:00:12.443 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.060) 0:00:12.503 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.026) 0:00:12.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.031) 0:00:12.561 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is 
undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.034) 0:00:12.596 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.037) 0:00:12.633 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.034) 0:00:12.668 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.034) 0:00:12.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.034) 0:00:12.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:44:45 +0000 
(0:00:00.031) 0:00:12.768 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.045) 0:00:12.813 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:44:45 +0000 (0:00:00.028) 0:00:12.842 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "present" } ], "packages": [ "xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:44:47 +0000 (0:00:01.775) 0:00:14.617 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.032) 0:00:14.650 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.029) 0:00:14.679 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "present" } ], "packages": [ "xfsprogs", "dosfstools", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.039) 0:00:14.719 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.036) 0:00:14.755 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", 
"type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.039) 0:00:14.795 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:44:47 +0000 (0:00:00.033) 0:00:14.828 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:44:48 +0000 (0:00:01.063) 0:00:15.891 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075', u'state': u'present', u'dump': 0, u'path': u'none', u'passno': 0, u'opts': u'defaults', u'fstype': u'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "present" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:44:48 +0000 (0:00:00.549) 0:00:16.441 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 
2022 17:44:49 +0000 (0:00:00.688) 0:00:17.129 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:44:50 +0000 (0:00:00.406) 0:00:17.535 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:44:50 +0000 (0:00:00.030) 0:00:17.566 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:35 Wednesday 01 June 2022 17:44:51 +0000 (0:00:00.902) 0:00:18.468 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:44:51 +0000 (0:00:00.053) 0:00:18.522 ******** skipping: [/cache/rhel-x.qcow2] 
=> {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:44:51 +0000 (0:00:00.032) 0:00:18.554 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:44:51 +0000 (0:00:00.039) 0:00:18.593 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "swap", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "c1cf3934-3f0d-4c57-8032-85e866ed0075" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:44:51 +0000 (0:00:00.488) 0:00:19.082 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002801", "end": "2022-06-01 13:44:51.407988", "rc": 0, "start": "2022-06-01 13:44:51.405187" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075 none swap defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.480) 0:00:19.562 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003379", "end": "2022-06-01 13:44:51.793577", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:44:51.790198" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.392) 0:00:19.955 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.034) 0:00:19.990 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.034) 0:00:20.025 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.066) 0:00:20.091 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.036) 0:00:20.127 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.118) 0:00:20.246 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, 
"changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.037) 0:00:20.284 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "1" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.045) 0:00:20.330 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.036) 0:00:20.366 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.037) 0:00:20.403 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:44:52 +0000 (0:00:00.033) 0:00:20.437 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/vdb" ], "delta": "0:00:00.002901", "end": "2022-06-01 13:44:52.676117", "rc": 0, "start": "2022-06-01 13:44:52.673216" } STDOUT: /dev/vdb TASK [Gather swap info] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.397) 0:00:20.835 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002882", "end": "2022-06-01 13:44:53.073617", "rc": 0, "start": "2022-06-01 13:44:53.070735" } STDOUT: Filename Type Size Used Priority /dev/vdb partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.435) 0:00:21.270 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.040) 0:00:21.311 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.032) 0:00:21.344 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075 " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.048) 0:00:21.392 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:44:53 +0000 (0:00:00.034) 0:00:21.426 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.036) 0:00:21.462 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.033) 0:00:21.496 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.031) 0:00:21.528 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.036) 0:00:21.565 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.037) 0:00:21.602 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105486.3891215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105486.3561215, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105486.3561215, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.408) 0:00:22.010 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.036) 0:00:22.047 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.034) 0:00:22.082 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] 
****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.036) 0:00:22.118 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.032) 0:00:22.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.040) 0:00:22.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.033) 0:00:22.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.030) 0:00:22.253 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.028) 0:00:22.282 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.037) 0:00:22.320 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.029) 0:00:22.349 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.031) 0:00:22.381 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:44:54 +0000 (0:00:00.031) 0:00:22.412 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.033) 0:00:22.445 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:22.477 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.041) 0:00:22.519 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.035) 0:00:22.554 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.028) 0:00:22.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.027) 0:00:22.610 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:22.642 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.030) 0:00:22.673 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:22.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.035) 0:00:22.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.031) 0:00:22.774 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:22.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.030) 0:00:22.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.030) 0:00:22.867 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.035) 0:00:22.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.030) 0:00:22.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:22.966 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.030) 0:00:22.996 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:23.028 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:23.060 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.034) 0:00:23.095 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.031) 0:00:23.127 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.031) 0:00:23.159 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:23.191 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.033) 0:00:23.224 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.033) 0:00:23.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.034) 0:00:23.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.031) 0:00:23.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:23.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.033) 0:00:23.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:44:55 +0000 (0:00:00.032) 0:00:23.422 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.038) 0:00:23.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.041) 0:00:23.501 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.033) 0:00:23.535 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.033) 0:00:23.569 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Format second disk as ext3] ********************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:37 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.030) 0:00:23.599 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.118) 0:00:23.718 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.046) 0:00:23.764 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.546) 0:00:24.311 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => 
(item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.072) 0:00:24.383 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:44:56 +0000 (0:00:00.032) 0:00:24.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.030) 0:00:24.446 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 
01 June 2022 17:44:57 +0000 (0:00:00.064) 0:00:24.511 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.030) 0:00:24.541 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.032) 0:00:24.574 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.035) 0:00:24.610 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdc" ], "fs_type": "ext3", "mount_point": "none", "name": "test2", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.038) 0:00:24.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.032) 0:00:24.681 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : 
make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.031) 0:00:24.713 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.033) 0:00:24.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.030) 0:00:24.777 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.041) 0:00:24.819 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:44:57 +0000 (0:00:00.028) 0:00:24.848 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdc", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "e2fsprogs", "dosfstools", "mdadm", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdc", "_kernel_device": "/dev/vdc", "_mount_id": 
"UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "_raw_kernel_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:44:59 +0000 (0:00:02.006) 0:00:26.854 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.031) 0:00:26.885 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.029) 0:00:26.915 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdc", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", 
"/dev/vdc", "/dev/vdd" ], "mounts": [], "packages": [ "e2fsprogs", "dosfstools", "mdadm", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdc", "_kernel_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "_raw_kernel_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.038) 0:00:26.953 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.034) 0:00:26.988 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdc", "_kernel_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "_raw_kernel_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.040) 0:00:27.028 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.030) 0:00:27.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.031) 0:00:27.090 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.030) 0:00:27.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:44:59 +0000 (0:00:00.030) 0:00:27.150 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:45:00 +0000 (0:00:00.415) 0:00:27.566 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:45:00 +0000 (0:00:00.029) 0:00:27.596 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:48 Wednesday 01 June 2022 17:45:01 +0000 (0:00:00.890) 0:00:28.487 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 
Wednesday 01 June 2022 17:45:01 +0000 (0:00:00.057) 0:00:28.544 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:45:01 +0000 (0:00:00.073) 0:00:28.618 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdc", "_kernel_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "_raw_kernel_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:45:01 +0000 (0:00:00.040) 0:00:28.658 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "swap", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "c1cf3934-3f0d-4c57-8032-85e866ed0075" }, "/dev/vdc": { "fstype": "ext3", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "d3b0d1d3-9dd7-486a-8010-23ffa060107a" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:45:01 +0000 (0:00:00.405) 0:00:29.063 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003002", "end": "2022-06-01 13:45:01.292505", "rc": 0, "start": "2022-06-01 13:45:01.289503" }
STDOUT:
UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075 none swap defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.387) 0:00:29.451 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002934", "end": "2022-06-01 13:45:01.678127", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:01.675193" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.385) 0:00:29.836 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.028) 0:00:29.865
********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.030) 0:00:29.895 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.062) 0:00:29.957 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.042) 0:00:30.000 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.118) 0:00:30.118 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdc" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.039) 0:00:30.158 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.043) 0:00:30.201 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.039) 0:00:30.241 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.036) 0:00:30.278 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.035) 0:00:30.313 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.038) 0:00:30.352 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.033) 0:00:30.385 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75
Wednesday 01 June 2022 17:45:02 +0000 (0:00:00.033) 0:00:30.418 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.033) 0:00:30.452 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.049) 0:00:30.501 ********
ok: [/cache/rhel-x.qcow2] => { "changed":
false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.033) 0:00:30.535 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.037) 0:00:30.572 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.029) 0:00:30.602 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.029) 0:00:30.632 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.036) 0:00:30.668 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.039) 0:00:30.708 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105498.6341214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105498.6341214, "dev": 5, "device_type": 64544, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 261, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105498.6341214, "nlink": 1, "path": "/dev/vdc", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.408) 0:00:31.117 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.037) 0:00:31.155 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.035) 0:00:31.190 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.032) 0:00:31.223 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.030) 0:00:31.253 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.035) 0:00:31.289 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.034) 0:00:31.323 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.032) 0:00:31.355 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.032) 0:00:31.388 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:45:03 +0000 (0:00:00.038) 0:00:31.426 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional
result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:31.458 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:31.490 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:31.525 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.067) 0:00:31.593 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.030) 0:00:31.623 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:31.659 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:31.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:31.727 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:31.760 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:31.796 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:31.828 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:31.859 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.030) 0:00:31.889 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:31.921 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:31.952 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.034) 0:00:31.987 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.030) 0:00:32.017 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.029) 0:00:32.046 ********
skipping: [/cache/rhel-x.qcow2] => {
"changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:32.078 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.042) 0:00:32.120 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:32.156 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.041) 0:00:32.198 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.034) 0:00:32.232 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.032) 0:00:32.265 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:32.296 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:32.328 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.031) 0:00:32.360 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.038) 0:00:32.398 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:45:04 +0000 (0:00:00.035) 0:00:32.434 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.031) 0:00:32.465 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.030) 0:00:32.496 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.032) 0:00:32.529 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.032) 0:00:32.561 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.035) 0:00:32.597 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.032) 0:00:32.630 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.032) 0:00:32.662 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.042) 0:00:32.705
********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.035) 0:00:32.740 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Change the disk device file system type to ext3] *************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:50
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.033) 0:00:32.773 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.080) 0:00:32.854 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:45:05 +0000 (0:00:00.045) 0:00:32.900 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.563) 0:00:33.463 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.075) 0:00:33.538 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.032) 0:00:33.571 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.032) 0:00:33.604 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.062) 0:00:33.666 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.028) 0:00:33.694 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.032) 0:00:33.727 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.036) 0:00:33.764 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_type": "ext3", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.041) 0:00:33.806 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.032) 0:00:33.838 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.031) 0:00:33.870 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.032) 0:00:33.902 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.031) 0:00:33.933 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.051) 0:00:33.985 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:45:06 +0000 (0:00:00.076) 0:00:34.062 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/vdb", "fs_type": "swap" }, { "action": "create format", "device": "/dev/vdb", "fs_type": "ext3" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, { "fstype": "swap", "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, { "dump": 0, "fstype": "ext3", "opts":
"defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:45:08 +0000 (0:00:02.026) 0:00:36.088 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:45:08 +0000 (0:00:00.032) 0:00:36.121 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:45:08 +0000 (0:00:00.031) 0:00:36.152 ******** ok: 
[/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/vdb", "fs_type": "swap" }, { "action": "create format", "device": "/dev/vdb", "fs_type": "ext3" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, { "fstype": "swap", "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 
17:45:08 +0000 (0:00:00.040) 0:00:36.193 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:45:08 +0000 (0:00:00.033) 0:00:36.227 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:45:08 +0000 (0:00:00.035) 0:00:36.262 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075', u'state': u'absent', u'fstype': u'swap', u'path': u'none'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": 
"UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075" } ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075', u'state': u'absent', u'path': u'none', u'fstype': u'swap'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075", "state": "absent" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=c1cf3934-3f0d-4c57-8032-85e866ed0075" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:45:09 +0000 (0:00:00.784) 0:00:37.046 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:45:10 +0000 (0:00:00.676) 0:00:37.723 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext3'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:45:10 +0000 (0:00:00.462) 0:00:38.186 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:45:11 +0000 (0:00:00.650) 0:00:38.836 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:45:11 +0000 (0:00:00.387) 0:00:39.224 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:45:11 +0000 (0:00:00.032) 0:00:39.257 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_swap.yml:61 Wednesday 01 June 2022 17:45:12 +0000 (0:00:00.892) 0:00:40.149 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:45:12 +0000 (0:00:00.063) 0:00:40.213 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:45:12 +0000 (0:00:00.033) 0:00:40.247 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:45:12 +0000 (0:00:00.041) 0:00:40.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "ext3", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" }, "/dev/vdc": { "fstype": "ext3", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "d3b0d1d3-9dd7-486a-8010-23ffa060107a" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:45:13 +0000 (0:00:00.423) 0:00:40.711 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002986", "end": "2022-06-01 13:45:12.929400", "rc": 0, "start": "2022-06-01 13:45:12.926414" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6 /opt/test ext3 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:45:13 +0000 (0:00:00.386) 0:00:41.098 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002948", "end": "2022-06-01 13:45:13.330809", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:13.327861" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.391) 0:00:41.489 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.031) 0:00:41.520 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.033) 0:00:41.554 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.064) 0:00:41.619 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.040) 0:00:41.659 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.119) 0:00:41.779 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.036) 0:00:41.815 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2427072, "block_size": 4096, "block_total": 2558167, "block_used": 131095, "device": "/dev/vdb", "fstype": "ext3", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9941286912, "size_total": 10478252032, "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2427072, "block_size": 4096, "block_total": 2558167, "block_used": 131095, "device": "/dev/vdb", "fstype": "ext3", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9941286912, "size_total": 10478252032, "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.047) 0:00:41.862 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.039) 0:00:41.901 ******** ok: 
[/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.039) 0:00:41.941 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.037) 0:00:41.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.029) 0:00:42.008 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.032) 0:00:42.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.032) 0:00:42.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.033) 0:00:42.106 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.054) 0:00:42.160 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.036) 0:00:42.197 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.040) 0:00:42.238 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.030) 0:00:42.269 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, 
"storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.031) 0:00:42.301 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.036) 0:00:42.337 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:45:14 +0000 (0:00:00.040) 0:00:42.377 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105507.8471215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105507.8471215, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105507.8471215, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.413) 0:00:42.791 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.038) 0:00:42.830 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.037) 0:00:42.867 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.035) 0:00:42.903 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.032) 0:00:42.935 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.036) 0:00:42.972 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.035) 0:00:43.008 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.032) 0:00:43.040 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.032) 0:00:43.073 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.039) 0:00:43.113 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.032) 0:00:43.146 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.031) 0:00:43.177 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.035) 0:00:43.212 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.032) 0:00:43.245 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.033) 0:00:43.278 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.040) 0:00:43.319 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.037) 0:00:43.357 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.034) 0:00:43.391 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86
Wednesday 01 June 2022 17:45:15 +0000 (0:00:00.035) 0:00:43.427 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.031) 0:00:43.458 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [get information about RAID] **********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.032) 0:00:43.491 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.031) 0:00:43.522 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.036) 0:00:43.558 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.039) 0:00:43.598 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.041) 0:00:43.640 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.080) 0:00:43.720 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.032) 0:00:43.753 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.031) 0:00:43.785 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.033) 0:00:43.818 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.033) 0:00:43.852 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.034) 0:00:43.886 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.038) 0:00:43.925 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.033) 0:00:43.958 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.034) 0:00:43.992 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.033) 0:00:44.025 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.038) 0:00:44.059 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.039) 0:00:44.097 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [debug] *******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.039) 0:00:44.136 ********
ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [assert] ******************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.034) 0:00:44.171 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.031) 0:00:44.203 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.032) 0:00:44.235 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [check segment type] ******************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.030) 0:00:44.266 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.032) 0:00:44.299 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.035) 0:00:44.334 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.032) 0:00:44.367 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.031) 0:00:44.399 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16
Wednesday 01 June 2022 17:45:16 +0000 (0:00:00.030) 0:00:44.429 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.030) 0:00:44.460 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false }

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:63
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.030) 0:00:44.490 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.081) 0:00:44.572 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.046) 0:00:44.618 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.535) 0:00:45.154 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" }
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.075) 0:00:45.229 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.033) 0:00:45.263 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.033) 0:00:45.296 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.073) 0:00:45.370 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.027) 0:00:45.398 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:45:17 +0000 (0:00:00.035) 0:00:45.434 ********
ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.039) 0:00:45.473 ********
ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_type": "ext3", "mount_point": "/opt/test", "name": "test1", "type": "disk" } ] }

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.041) 0:00:45.515 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.031) 0:00:45.546 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.032) 0:00:45.579 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.031) 0:00:45.610 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.029) 0:00:45.639 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.045) 0:00:45.685 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:45:18 +0000 (0:00:00.032) 0:00:45.717 ********
ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:45:19 +0000 (0:00:01.565) 0:00:47.283 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:45:19 +0000 (0:00:00.030) 0:00:47.313 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:45:19 +0000 (0:00:00.028) 0:00:47.342 ********
ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } }

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 01 June 2022 17:45:19 +0000 (0:00:00.039) 0:00:47.381 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 01 June 2022 17:45:19 +0000 (0:00:00.036) 0:00:47.418 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 01 June 2022 17:45:20 +0000 (0:00:00.038) 0:00:47.456 ********

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 01 June 2022 17:45:20 +0000 (0:00:00.030) 0:00:47.487 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 01 June 2022 17:45:20 +0000 (0:00:00.680) 0:00:48.168 ********
ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext3'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "dump": 0, "fstype": "ext3", "opts": "defaults", "passno": 0, "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "mounted" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6" }

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 01 June 2022 17:45:21 +0000 (0:00:00.394) 0:00:48.562 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 01 June 2022 17:45:21 +0000 (0:00:00.683) 0:00:49.246 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 01 June 2022 17:45:22 +0000 (0:00:00.379) 0:00:49.625 ********

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 01 June 2022 17:45:22 +0000 (0:00:00.028) 0:00:49.654 ********
ok: [/cache/rhel-x.qcow2]

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:74
Wednesday 01 June 2022 17:45:23 +0000 (0:00:00.852) 0:00:50.507 ********
included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2

TASK [Print out pool information] **********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1
Wednesday 01 June 2022 17:45:23 +0000 (0:00:00.066) 0:00:50.574 ********
skipping: [/cache/rhel-x.qcow2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6
Wednesday 01 June 2022 17:45:23 +0000 (0:00:00.031) 0:00:50.605 ********
ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14
Wednesday 01 June 2022 17:45:23 +0000 (0:00:00.040) 0:00:50.645 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "ext3", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" }, "/dev/vdc": { "fstype": "ext3", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "d3b0d1d3-9dd7-486a-8010-23ffa060107a" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:45:23 +0000 (0:00:00.400) 0:00:51.046 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004392", "end": "2022-06-01 13:45:23.273133", "rc": 0, "start": "2022-06-01 13:45:23.268741" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6 /opt/test ext3 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.398) 0:00:51.445 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003008", "end": "2022-06-01 13:45:23.665445", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:23.662437" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.385) 0:00:51.830 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.031) 0:00:51.862 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.033) 0:00:51.895 ********
[WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2

TASK [set_fact] ****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.097) 0:00:51.992 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [include_tasks] ***********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.039) 0:00:52.032 ********
included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2
included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.117) 0:00:52.149 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.040) 0:00:52.190 ********
ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2427072, "block_size": 4096, "block_total": 2558167, "block_used": 131095, "device": "/dev/vdb", "fstype": "ext3", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9941286912, "size_total": 10478252032, "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2427072, "block_size": 4096, "block_total": 2558167, "block_used": 131095, "device": "/dev/vdb", "fstype": "ext3", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test", "options": "rw,seclabel,relatime", "size_available": 9941286912, "size_total": 10478252032, "uuid": "e9bfedba-521d-45ed-b9e4-9a7372c495b6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.042) 0:00:52.232 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.040) 0:00:52.273 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45
Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.040) 0:00:52.313 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false }
MSG: All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54
Wednesday 01
June 2022 17:45:24 +0000 (0:00:00.039) 0:00:52.353 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.031) 0:00:52.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:45:24 +0000 (0:00:00.031) 0:00:52.417 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.031) 0:00:52.448 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.032) 0:00:52.481 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test ext3 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.051) 0:00:52.532 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.035) 0:00:52.568 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.038) 0:00:52.606 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.031) 0:00:52.638 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.031) 0:00:52.669 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:45:25 
+0000 (0:00:00.040) 0:00:52.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.042) 0:00:52.752 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105507.8471215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105507.8471215, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105507.8471215, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.403) 0:00:53.155 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.037) 0:00:53.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.038) 0:00:53.232 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.036) 0:00:53.269 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.033) 0:00:53.302 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.037) 0:00:53.340 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.035) 0:00:53.375 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.033) 0:00:53.408 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:45:25 +0000 (0:00:00.032) 0:00:53.441 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got 
info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.039) 0:00:53.480 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:53.514 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:53.547 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.035) 0:00:53.582 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:53.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:53.648 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.040) 0:00:53.688 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.036) 0:00:53.724 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.030) 0:00:53.754 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.034) 0:00:53.788 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:53.821 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:53.853 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.034) 0:00:53.888 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:53.921 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.034) 0:00:53.956 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.035) 0:00:53.991 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:54.025 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:54.058 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.034) 0:00:54.093 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:54.126 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.041) 0:00:54.167 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.039) 0:00:54.207 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.041) 0:00:54.248 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:54.281 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.032) 0:00:54.314 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.031) 0:00:54.345 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.030) 0:00:54.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:45:26 +0000 (0:00:00.033) 0:00:54.409 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.092) 0:00:54.502 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.036) 0:00:54.538 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.030) 0:00:54.569 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 0:00:54.601 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 0:00:54.634 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.033) 0:00:54.667 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.035) 0:00:54.702 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 0:00:54.734 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 
0:00:54.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.031) 0:00:54.798 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 0:00:54.831 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Change it back to swap] ************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:76 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.032) 0:00:54.863 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.087) 0:00:54.951 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:45:27 +0000 (0:00:00.049) 0:00:55.000 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.580) 0:00:55.581 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.070) 0:00:55.651 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.031) 0:00:55.683 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.034) 0:00:55.717 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed 
on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.063) 0:00:55.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.027) 0:00:55.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.030) 0:00:55.839 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.038) 0:00:55.877 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.038) 0:00:55.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.033) 0:00:55.948 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.032) 0:00:55.980 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.030) 0:00:56.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.032) 0:00:56.044 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.049) 0:00:56.093 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:45:28 +0000 (0:00:00.030) 0:00:56.124 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/vdb", "fs_type": "ext3" }, { "action": "create format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { 
"fstype": "ext3", "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "absent" }, { "path": "/opt/test", "state": "absent" }, { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:45:30 +0000 (0:00:01.925) 0:00:58.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:45:30 +0000 (0:00:00.033) 0:00:58.083 ******** TASK [linux-system-roles.storage : show blivet_output] 
************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:45:30 +0000 (0:00:00.030) 0:00:58.114 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/vdb", "fs_type": "ext3" }, { "action": "create format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext3", "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "absent" }, { "path": "/opt/test", "state": "absent" }, { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" } ], "packages": [ "mdadm", "xfsprogs", "dosfstools", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] 
*** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:45:30 +0000 (0:00:00.042) 0:00:58.156 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:45:30 +0000 (0:00:00.035) 0:00:58.192 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:45:30 +0000 (0:00:00.038) 0:00:58.230 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6', u'state': u'absent', u'fstype': u'ext3', u'path': u'/opt/test'}) => { "ansible_loop_var": "mount_info", "changed": true, 
"dump": "0", "fstab": "/etc/fstab", "fstype": "ext3", "mount_info": { "fstype": "ext3", "path": "/opt/test", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0", "src": "UUID=e9bfedba-521d-45ed-b9e4-9a7372c495b6" } ok: [/cache/rhel-x.qcow2] => (item={u'path': u'/opt/test', u'state': u'absent'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test", "state": "absent" }, "name": "/opt/test", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:45:31 +0000 (0:00:00.745) 0:00:58.976 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:45:32 +0000 (0:00:00.671) 0:00:59.648 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1', u'state': u'present', u'dump': 0, u'path': u'none', u'passno': 0, u'opts': u'defaults', u'fstype': u'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:45:32 +0000 (0:00:00.416) 0:01:00.064 ******** 
ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:45:33 +0000 (0:00:00.724) 0:01:00.788 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:45:33 +0000 (0:00:00.404) 0:01:01.193 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:45:33 +0000 (0:00:00.031) 0:01:01.225 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:86 Wednesday 01 June 2022 17:45:34 +0000 (0:00:00.887) 0:01:02.113 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml 
for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:45:34 +0000 (0:00:00.070) 0:01:02.184 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:45:34 +0000 (0:00:00.031) 0:01:02.215 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:45:34 +0000 (0:00:00.039) 0:01:02.254 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "swap", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" }, "/dev/vdc": { "fstype": "ext3", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "d3b0d1d3-9dd7-486a-8010-23ffa060107a" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:45:35 +0000 (0:00:00.382) 0:01:02.637 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002825", "end": "2022-06-01 13:45:34.875738", "rc": 0, "start": "2022-06-01 13:45:34.872913" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1 none swap defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:45:35 +0000 (0:00:00.394) 0:01:03.031 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002963", "end": "2022-06-01 13:45:35.247558", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:35.244595" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:45:35 +0000 (0:00:00.374) 0:01:03.406 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:45:35 +0000 (0:00:00.029) 0:01:03.435 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.031) 0:01:03.467 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.063) 0:01:03.531 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.035) 0:01:03.566 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.114) 0:01:03.681 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.043) 0:01:03.724 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "1" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.040) 0:01:03.765 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.035) 0:01:03.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.035) 0:01:03.837 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.031) 0:01:03.868 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/vdb" ], "delta": 
"0:00:00.003341", "end": "2022-06-01 13:45:36.123491", "rc": 0, "start": "2022-06-01 13:45:36.120150" } STDOUT: /dev/vdb TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:45:36 +0000 (0:00:00.418) 0:01:04.286 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.003013", "end": "2022-06-01 13:45:36.521690", "rc": 0, "start": "2022-06-01 13:45:36.518677" } STDOUT: Filename Type Size Used Priority /dev/vdb partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.395) 0:01:04.681 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.040) 0:01:04.722 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.031) 0:01:04.754 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1 " ], "storage_test_fstab_mount_options_matches": [], 
"storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.047) 0:01:04.801 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.035) 0:01:04.836 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.036) 0:01:04.872 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.032) 0:01:04.905 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.034) 0:01:04.939 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.038) 0:01:04.978 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.041) 0:01:05.019 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105529.8371215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105529.8061216, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105529.8061216, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:45:37 +0000 (0:00:00.412) 0:01:05.432 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.035) 0:01:05.468 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:45:38 
+0000 (0:00:00.035) 0:01:05.503 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.033) 0:01:05.537 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.029) 0:01:05.567 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.034) 0:01:05.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.030) 0:01:05.632 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.028) 0:01:05.661 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.031) 0:01:05.692 ******** ok: [/cache/rhel-x.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.034) 0:01:05.727 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.028) 0:01:05.755 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.030) 0:01:05.786 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.032) 0:01:05.819 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.073) 0:01:05.892 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.033) 0:01:05.925 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.053) 0:01:05.979 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.036) 0:01:06.016 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.031) 0:01:06.048 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.032) 0:01:06.081 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.049) 0:01:06.130 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 
2022 17:45:38 +0000 (0:00:00.049) 0:01:06.180 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.050) 0:01:06.231 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.039) 0:01:06.271 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.033) 0:01:06.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.033) 0:01:06.339 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.037) 0:01:06.376 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:45:38 +0000 (0:00:00.035) 0:01:06.412 ******** skipping: 
[/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.035) 0:01:06.447 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.032) 0:01:06.479 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.030) 0:01:06.510 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.031) 0:01:06.542 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.045) 0:01:06.587 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.033) 0:01:06.621 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] 
************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.031) 0:01:06.652 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.031) 0:01:06.683 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.032) 0:01:06.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.034) 0:01:06.751 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.040) 0:01:06.791 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.034) 0:01:06.826 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.030) 0:01:06.856 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.029) 0:01:06.886 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.032) 0:01:06.918 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.032) 0:01:06.950 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.033) 0:01:06.984 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.030) 0:01:07.015 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 
Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.029) 0:01:07.044 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.028) 0:01:07.073 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.030) 0:01:07.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:88 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.030) 0:01:07.134 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.092) 0:01:07.226 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:45:39 +0000 (0:00:00.047) 0:01:07.274 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.537) 0:01:07.811 ******** skipping: [/cache/rhel-x.qcow2] 
=> (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.074) 0:01:07.885 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.030) 0:01:07.916 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.031) 0:01:07.948 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.061) 0:01:08.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.028) 0:01:08.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.032) 0:01:08.070 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.037) 0:01:08.108 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_type": "swap", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.038) 0:01:08.146 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:45:40 +0000 
(0:00:00.030) 0:01:08.176 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.028) 0:01:08.205 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.028) 0:01:08.234 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.031) 0:01:08.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.049) 0:01:08.315 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:45:40 +0000 (0:00:00.074) 0:01:08.390 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": 
"UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" }, { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:45:42 +0000 (0:00:01.526) 0:01:09.916 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:45:42 +0000 (0:00:00.031) 0:01:09.947 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:45:42 +0000 (0:00:00.033) 0:01:09.980 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" }, { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" } ], "packages": [ "dosfstools", "xfsprogs", "e2fsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:45:42 +0000 (0:00:00.043) 0:01:10.023 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:45:42 +0000 (0:00:00.037) 0:01:10.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:45:42 +0000 (0:00:00.037) 0:01:10.099 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1', u'state': u'absent', u'path': u'none', u'fstype': u'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" }, "name": "none", "opts": "defaults", "passno": 
"0", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:45:43 +0000 (0:00:00.419) 0:01:10.518 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:45:43 +0000 (0:00:00.687) 0:01:11.205 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1', u'state': u'present', u'dump': 0, u'path': u'none', u'passno': 0, u'opts': u'defaults', u'fstype': u'swap'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "dump": 0, "fstype": "swap", "opts": "defaults", "passno": 0, "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "present" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:45:44 +0000 (0:00:00.399) 0:01:11.605 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:45:44 +0000 (0:00:00.682) 0:01:12.288 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": 
"da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:45:45 +0000 (0:00:00.441) 0:01:12.729 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:45:45 +0000 (0:00:00.030) 0:01:12.759 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:98 Wednesday 01 June 2022 17:45:46 +0000 (0:00:00.882) 0:01:13.642 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:45:46 +0000 (0:00:00.074) 0:01:13.716 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:45:46 +0000 (0:00:00.033) 0:01:13.750 ******** ok: [/cache/rhel-x.qcow2] 
=> { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:45:46 +0000 (0:00:00.039) 0:01:13.789 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "swap", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" }, "/dev/vdc": { "fstype": "ext3", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "d3b0d1d3-9dd7-486a-8010-23ffa060107a" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:45:46 +0000 (0:00:00.403) 0:01:14.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002872", "end": "2022-06-01 13:45:46.418353", "rc": 0, "start": "2022-06-01 13:45:46.415481" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat 
defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1 none swap defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.385) 0:01:14.578 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003375", "end": "2022-06-01 13:45:46.808893", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:46.805518" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.393) 0:01:14.971 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.029) 0:01:15.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.031) 0:01:15.033 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.070) 0:01:15.103 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.037) 0:01:15.140 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.113) 0:01:15.254 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.038) 0:01:15.292 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "1" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.045) 0:01:15.338 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.036) 0:01:15.374 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:45:47 +0000 (0:00:00.039) 0:01:15.414 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.030) 0:01:15.445 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/vdb" ], "delta": "0:00:00.002855", "end": "2022-06-01 13:45:47.690454", "rc": 0, "start": "2022-06-01 13:45:47.687599" } STDOUT: /dev/vdb TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.403) 0:01:15.849 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002951", "end": "2022-06-01 13:45:48.082289", "rc": 0, "start": "2022-06-01 
13:45:48.079338" } STDOUT: Filename Type Size Used Priority /dev/vdb partition 10485756 0 -2 TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.393) 0:01:16.242 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.044) 0:01:16.286 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.037) 0:01:16.324 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [ "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1 " ], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:45:48 +0000 (0:00:00.048) 0:01:16.373 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 
01 June 2022 17:45:48 +0000 (0:00:00.040) 0:01:16.413 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.039) 0:01:16.453 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.032) 0:01:16.485 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.031) 0:01:16.517 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.042) 0:01:16.560 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.045) 0:01:16.605 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105529.8371215, "attr_flags": "", "attributes": [], "block_size": 
4096, "blocks": 0, "charset": "binary", "ctime": 1654105529.8061216, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105529.8061216, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.412) 0:01:17.017 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.040) 0:01:17.058 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.037) 0:01:17.095 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.033) 0:01:17.128 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.028) 0:01:17.157 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.034) 0:01:17.191 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.029) 0:01:17.220 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.031) 0:01:17.252 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.032) 0:01:17.284 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.040) 0:01:17.324 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 
17:45:49 +0000 (0:00:00.032) 0:01:17.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.031) 0:01:17.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:45:49 +0000 (0:00:00.030) 0:01:17.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.078) 0:01:17.530 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.039) 0:01:17.570 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.037) 0:01:17.608 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.640 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.035) 0:01:17.740 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.034) 0:01:17.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:17.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:17.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.031) 0:01:17.904 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.035) 0:01:17.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:17.974 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:18.007 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:18.040 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:18.074 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.035) 0:01:18.109 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.036) 0:01:18.145 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.032) 0:01:18.178 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:18.211 ******** skipping: [/cache/rhel-x.qcow2] => {} 
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.031) 0:01:18.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.031) 0:01:18.275 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.036) 0:01:18.311 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.039) 0:01:18.350 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:18.384 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:45:50 +0000 (0:00:00.033) 0:01:18.418 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.034) 0:01:18.452 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.032) 0:01:18.485 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.031) 0:01:18.517 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.034) 0:01:18.551 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.032) 0:01:18.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.033) 0:01:18.617 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] 
********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.032) 0:01:18.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:100 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.030) 0:01:18.680 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.117) 0:01:18.798 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.049) 0:01:18.847 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:45:51 +0000 (0:00:00.552) 0:01:19.400 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", 
"vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.075) 0:01:19.476 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.031) 0:01:19.508 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.035) 0:01:19.543 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.063) 0:01:19.607 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 
2022 17:45:52 +0000 (0:00:00.028) 0:01:19.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.033) 0:01:19.669 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.035) 0:01:19.704 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "mount_point": "/opt/test", "name": "test1", "state": "absent", "type": "disk" }, { "disks": [ "vdc" ], "mount_point": "none", "name": "test2", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.037) 0:01:19.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.030) 0:01:19.772 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.030) 0:01:19.802 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.028) 0:01:19.831 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.028) 0:01:19.860 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.099) 0:01:19.960 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:45:52 +0000 (0:00:00.030) 0:01:19.990 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/vdc", "fs_type": "ext3" }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 
0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:45:54 +0000 (0:00:02.005) 0:01:21.996 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:45:54 +0000 (0:00:00.029) 0:01:22.026 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:45:54 +0000 (0:00:00.029) 0:01:22.055 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/vdc", "fs_type": "ext3" }, { "action": "destroy format", "device": "/dev/vdb", "fs_type": "swap" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "swap", "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" } ], "packages": [ "xfsprogs", "mdadm", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:45:54 +0000 (0:00:00.041) 0:01:22.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:45:54 +0000 (0:00:00.037) 0:01:22.134 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:45:54 +0000 (0:00:00.040) 0:01:22.175 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1', u'state': u'absent', u'fstype': u'swap', u'path': u'none'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "swap", "mount_info": { "fstype": "swap", "path": "none", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "state": "absent" }, "name": "none", "opts": "defaults", "passno": "0", "src": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:45:55 +0000 (0:00:00.396) 0:01:22.571 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:45:55 +0000 (0:00:00.710) 0:01:23.282 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:45:55 +0000 (0:00:00.033) 0:01:23.315 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:45:56 +0000 (0:00:00.693) 0:01:24.008 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:45:56 +0000 (0:00:00.407) 0:01:24.415 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:45:57 +0000 (0:00:00.031) 0:01:24.447 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:116 Wednesday 01 June 2022 17:45:57 +0000 (0:00:00.903) 0:01:25.351 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:45:57 +0000 (0:00:00.091) 0:01:25.442 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:45:58 +0000 (0:00:00.033) 0:01:25.476 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_mount_id": "UUID=4f7d3e4b-8958-42bb-b8ed-d24fe2327cf1", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "swap", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null }, { "_device": "/dev/vdc", "_mount_id": "UUID=d3b0d1d3-9dd7-486a-8010-23ffa060107a", "_raw_device": "/dev/vdc", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext3", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "none", "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:45:58 +0000 (0:00:00.046) 0:01:25.522 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:45:58 +0000 (0:00:00.391) 0:01:25.914 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002943", "end": "2022-06-01 13:45:58.152625", "rc": 0, "start": "2022-06-01 13:45:58.149682" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:45:58 +0000 (0:00:00.397) 0:01:26.311 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003014", "end": "2022-06-01 13:45:58.534810", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:45:58.531796" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.384) 0:01:26.696 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.029) 0:01:26.726 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were 
correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.032) 0:01:26.759 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.081) 0:01:26.841 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.035) 0:01:26.876 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.113) 0:01:26.989 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.036) 0:01:27.026 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.043) 0:01:27.070 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.036) 0:01:27.107 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.038) 0:01:27.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:45:59 +0000 (0:00:00.032) 0:01:27.178 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "realpath", "/dev/vdb" ], "delta": "0:00:00.003768", "end": "2022-06-01 
13:45:59.416419", "rc": 0, "start": "2022-06-01 13:45:59.412651" } STDOUT: /dev/vdb TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.399) 0:01:27.577 ******** changed: [/cache/rhel-x.qcow2] => { "changed": true, "cmd": [ "cat", "/proc/swaps" ], "delta": "0:00:00.002885", "end": "2022-06-01 13:45:59.812217", "rc": 0, "start": "2022-06-01 13:45:59.809332" } STDOUT: Filename Type Size Used Priority TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.399) 0:01:27.977 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.049) 0:01:28.027 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.031) 0:01:28.059 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears 
in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.049) 0:01:28.108 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.028) 0:01:28.136 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.040) 0:01:28.177 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.031) 0:01:28.208 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.035) 0:01:28.243 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:46:00 +0000 
(0:00:00.034) 0:01:28.278 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:46:00 +0000 (0:00:00.027) 0:01:28.305 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105553.7551215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105553.7551215, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105553.7551215, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.404) 0:01:28.709 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.037) 0:01:28.746 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.028) 0:01:28.774 
******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.037) 0:01:28.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.033) 0:01:28.846 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.030) 0:01:28.876 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.031) 0:01:28.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.032) 0:01:28.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.032) 0:01:28.973 ******** skipping: [/cache/rhel-x.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.028) 0:01:29.002 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.032) 0:01:29.034 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.034) 0:01:29.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.083) 0:01:29.152 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.033) 0:01:29.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.034) 0:01:29.220 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.040) 0:01:29.260 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.035) 0:01:29.296 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.032) 0:01:29.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.034) 0:01:29.363 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.033) 0:01:29.396 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:46:01 +0000 (0:00:00.033) 0:01:29.430 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.032) 0:01:29.463 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:29.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.032) 0:01:29.529 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.036) 0:01:29.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:29.598 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 
17:46:02 +0000 (0:00:00.036) 0:01:29.635 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.037) 0:01:29.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:29.706 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.030) 0:01:29.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.034) 0:01:29.771 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.035) 0:01:29.807 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:29.841 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the 
size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.032) 0:01:29.873 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.030) 0:01:29.903 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.030) 0:01:29.933 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.036) 0:01:29.970 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.042) 0:01:30.013 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.036) 0:01:30.050 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.031) 0:01:30.082 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.031) 0:01:30.113 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.031) 0:01:30.145 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.034) 0:01:30.179 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:30.213 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.032) 0:01:30.245 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.033) 0:01:30.279 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.034) 0:01:30.313 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.038) 0:01:30.351 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:46:02 +0000 (0:00:00.041) 0:01:30.392 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.119) 0:01:30.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdc" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.036) 0:01:30.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.044) 0:01:30.593 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.030) 0:01:30.623 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.037) 0:01:30.660 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.031) 0:01:30.692 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.034) 0:01:30.726 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.033) 0:01:30.760 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.032) 0:01:30.792 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.034) 0:01:30.826 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.049) 0:01:30.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.027) 0:01:30.903 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.039) 0:01:30.942 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.030) 0:01:30.973 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.030) 0:01:31.003 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.031) 0:01:31.035 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:46:03 +0000 (0:00:00.026) 0:01:31.062 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105553.7111216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105553.7111216, "dev": 5, "device_type": 64544, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 261, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105553.7111216, "nlink": 1, "path": "/dev/vdc", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.408) 0:01:31.471 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.038) 0:01:31.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.071) 0:01:31.580 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.034) 0:01:31.615 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.032) 0:01:31.648 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.027) 0:01:31.676 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.033) 0:01:31.709 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.033) 0:01:31.742 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.034) 0:01:31.777 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.030) 0:01:31.807 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.032) 0:01:31.839 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.031) 0:01:31.871 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.031) 0:01:31.903 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.030) 0:01:31.934 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.036) 0:01:31.970 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.039) 0:01:32.011 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.038) 0:01:32.049 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.029) 0:01:32.078 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.033) 0:01:32.111 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.031) 0:01:32.143 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.039) 0:01:32.182 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.042) 0:01:32.225 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.032) 0:01:32.258 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.034) 0:01:32.293 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.031) 0:01:32.325 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.031) 0:01:32.357 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.033) 0:01:32.390 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] 
************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:46:04 +0000 (0:00:00.035) 0:01:32.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.034) 0:01:32.460 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.032) 0:01:32.492 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.035) 0:01:32.528 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.037) 0:01:32.566 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.036) 0:01:32.603 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.032) 0:01:32.635 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.033) 0:01:32.669 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.038) 0:01:32.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.031) 0:01:32.739 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.034) 0:01:32.774 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.034) 0:01:32.809 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.034) 0:01:32.843 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.031) 0:01:32.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.030) 0:01:32.905 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.030) 0:01:32.935 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.029) 0:01:32.965 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.032) 0:01:32.998 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.031) 0:01:33.030 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.031) 0:01:33.061 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.032) 0:01:33.093 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=471 changed=21 unreachable=0 failed=0 skipped=427 rescued=0 ignored=0 Wednesday 01 June 2022 17:46:05 +0000 (0:00:00.015) 0:01:33.109 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.03s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.01s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.93s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage 
the pools and volumes to match the specified state --- 1.78s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.57s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories -------------------------------------------- 1.41s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.13s /tmp/tmp7247_7fr/tests/tests_swap.yml:2 --------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.11s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.06s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts 
------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:46:06 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:46:07 +0000 (0:00:01.352) 0:00:01.375 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.35s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_swap_nvme_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_swap_nvme_generated.yml

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:46:07 +0000 (0:00:00.022) 0:00:01.398 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.35s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: rhel-x_setup.yml *****************************************************
1 plays in /cache/rhel-x_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-x_setup.yml:6
Wednesday 01 June 2022 17:46:08 +0000 (0:00:00.023) 0:00:00.023 ********
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => (item=None) => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
ok: [/cache/rhel-x.qcow2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Wednesday 01 June 2022 17:46:09 +0000 (0:00:01.340) 0:00:01.363 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------

PLAYBOOK: tests_swap_scsi_generated.yml ****************************************
2 plays in /tmp/tmp7247_7fr/tests/tests_swap_scsi_generated.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_swap_scsi_generated.yml:3
Wednesday 01 June 2022 17:46:09 +0000 (0:00:00.019) 0:00:01.383 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [set disk interface for test] *********************************************
task path: /tmp/tmp7247_7fr/tests/tests_swap_scsi_generated.yml:7
Wednesday 01 June 2022 17:46:11 +0000 (0:00:01.110) 0:00:02.493 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_test_use_interface": "scsi"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:2
Wednesday 01 June 2022 17:46:11 +0000 (0:00:00.028) 0:00:02.522 ********
ok: [/cache/rhel-x.qcow2]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:10
Wednesday 01 June 2022 17:46:11 +0000 (0:00:00.827) 0:00:03.349 ********

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 01 June 2022 17:46:11 +0000 (0:00:00.043) 0:00:03.393 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 01 June 2022 17:46:12 +0000 (0:00:00.161) 0:00:03.554 ********
ok: [/cache/rhel-x.qcow2]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 01 June 2022 17:46:12 +0000 (0:00:00.554) 0:00:04.108 ********
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.1.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 01 June 2022 17:46:12 +0000 (0:00:00.080) 0:00:04.189 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 01 June 2022 17:46:12 +0000 (0:00:00.024) 0:00:04.214 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 01 June 2022 17:46:12 +0000 (0:00:00.024) 0:00:04.239 ********
included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 01 June 2022 17:46:13 +0000 (0:00:00.209) 0:00:04.448 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 01 June 2022 17:46:13 +0000 (0:00:00.020) 0:00:04.469 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 01 June 2022 17:46:14 +0000 (0:00:01.081) 0:00:05.550 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 01 June 2022 17:46:14 +0000 (0:00:00.048) 0:00:05.598 ********
ok: [/cache/rhel-x.qcow2] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 01 June 2022 17:46:14 +0000 (0:00:00.049) 0:00:05.647 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 01 June 2022 17:46:14 +0000 (0:00:00.703) 0:00:06.351 ********
included:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 01 June 2022 17:46:14 +0000 (0:00:00.080) 0:00:06.432 ********

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 01 June 2022 17:46:15 +0000 (0:00:00.023) 0:00:06.456 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 01 June 2022 17:46:15 +0000 (0:00:00.025) 0:00:06.482 ********

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 01 June 2022 17:46:15 +0000 (0:00:00.025) 0:00:06.507 ********
ok: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 01 June 2022 17:46:15 +0000 (0:00:00.831) 0:00:07.338 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" },
            "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" },
            "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" },
            "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" },
            "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" },
            "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" },
            "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 01 June 2022 17:46:17 +0000 (0:00:01.822) 0:00:09.161 ********
ok: [/cache/rhel-x.qcow2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 01 June 2022 17:46:17 +0000 (0:00:00.047) 0:00:09.208 ********

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 01 June 2022 17:46:17 +0000 (0:00:00.030) 0:00:09.238 ********
ok: [/cache/rhel-x.qcow2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.547) 0:00:09.786 ********
skipping: [/cache/rhel-x.qcow2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.031) 0:00:09.817 ********

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.027) 0:00:09.845
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.036) 0:00:09.881 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.035) 0:00:09.917 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.037) 0:00:09.954 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.028) 0:00:09.983 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.032) 0:00:10.015 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.031) 0:00:10.046 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:46:18 +0000 (0:00:00.032) 0:00:10.079 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:46:19 +0000 (0:00:00.534) 0:00:10.614 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:46:19 +0000 (0:00:00.030) 0:00:10.644 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:13 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.880) 0:00:11.524 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_swap.yml:20 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.029) 0:00:11.554 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.048) 0:00:11.603 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": "Unable to find unused disk" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.554) 0:00:12.157 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.031) 0:00:12.189 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: Unable to find enough unused disks. Exiting playbook. 
PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0 Wednesday 01 June 2022 17:46:20 +0000 (0:00:00.019) 0:00:12.209 ******** =============================================================================== linux-system-roles.storage : get service facts -------------------------- 1.82s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 set up internal repositories -------------------------------------------- 1.34s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- Gathering Facts --------------------------------------------------------- 1.11s /tmp/tmp7247_7fr/tests/tests_swap_scsi_generated.yml:3 ------------------------ linux-system-roles.storage : make sure blivet is available -------------- 1.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : make sure required packages are installed --- 0.83s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Gathering Facts --------------------------------------------------------- 0.83s /tmp/tmp7247_7fr/tests/tests_swap.yml:2 --------------------------------------- linux-system-roles.storage : get required packages ---------------------- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Find unused disks in the system ----------------------------------------- 0.55s /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ---------------------------------- linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 -- linux-system-roles.storage : manage the pools and volumes to match 
the specified state --- 0.55s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.53s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 linux-system-roles.storage : include the appropriate provider tasks ----- 0.21s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 ----- linux-system-roles.storage : set platform/version specific variables ---- 0.16s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------ linux-system-roles.storage : Set platform/version specific variables ---- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 -- linux-system-roles.storage : enable copr repositories if needed --------- 0.08s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 linux-system-roles.storage : show storage_volumes ----------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 include_tasks ----------------------------------------------------------- 0.05s /tmp/tmp7247_7fr/tests/tests_swap.yml:20 -------------------------------------- linux-system-roles.storage : show storage_pools ------------------------- 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.05s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file 
Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:46:21 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden 
due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:46:22 +0000 (0:00:01.358) 0:00:01.381 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.36s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_volume_relabel.yml ********************************************* 1 plays in /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:2 Wednesday 01 June 2022 17:46:22 +0000 (0:00:00.015) 0:00:01.396 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:10 Wednesday 01 June 2022 17:46:24 +0000 (0:00:01.109) 0:00:02.506 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.041) 0:00:02.547 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.162) 0:00:02.710 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] 
**** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.628) 0:00:03.339 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.076) 0:00:03.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.022) 0:00:03.438 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 
Wednesday 01 June 2022 17:46:24 +0000 (0:00:00.021) 0:00:03.460 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:46:25 +0000 (0:00:00.197) 0:00:03.657 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:46:25 +0000 (0:00:00.020) 0:00:03.678 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:46:26 +0000 (0:00:01.172) 0:00:04.850 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:46:26 +0000 (0:00:00.048) 0:00:04.899 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:46:26 +0000 (0:00:00.047) 0:00:04.946 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
[linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:46:27 +0000 (0:00:00.721) 0:00:05.668 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:46:27 +0000 (0:00:00.083) 0:00:05.751 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:46:27 +0000 (0:00:00.021) 0:00:05.773 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:46:27 +0000 (0:00:00.024) 0:00:05.797 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:46:27 +0000 (0:00:00.021) 0:00:05.819 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:46:28 +0000 (0:00:00.895) 0:00:06.715 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": "systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", 
"source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": 
{ "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:46:30 +0000 (0:00:01.883) 0:00:08.598 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.045) 0:00:08.644 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.066) 0:00:08.710 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.537) 0:00:09.248 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.029) 0:00:09.278 
******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.026) 0:00:09.305 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.034) 0:00:09.339 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.036) 0:00:09.376 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.032) 0:00:09.408 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.029) 0:00:09.437 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:46:30 +0000 (0:00:00.029) 0:00:09.467 ******** TASK 
[linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:46:31 +0000 (0:00:00.029) 0:00:09.496 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:46:31 +0000 (0:00:00.032) 0:00:09.528 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:46:31 +0000 (0:00:00.486) 0:00:10.015 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:46:31 +0000 (0:00:00.044) 0:00:10.060 ******** ok: 
[/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:13 Wednesday 01 June 2022 17:46:32 +0000 (0:00:00.866) 0:00:10.926 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:20 Wednesday 01 June 2022 17:46:32 +0000 (0:00:00.035) 0:00:10.962 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:46:32 +0000 (0:00:00.090) 0:00:11.052 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": [ "vdb" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.598) 0:00:11.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "unused_disks": [ "vdb" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.036) 0:00:11.687 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:19 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.030) 0:00:11.718 ******** ok: [/cache/rhel-x.qcow2] => { "unused_disks": [ "vdb" ] } TASK [set label] *************************************************************** task path: 
/tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:25 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.034) 0:00:11.752 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.059) 0:00:11.812 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.046) 0:00:11.858 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.536) 0:00:12.395 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty 
list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:46:33 +0000 (0:00:00.073) 0:00:12.468 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.031) 0:00:12.500 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.031) 0:00:12.531 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.098) 0:00:12.629 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.029) 0:00:12.659 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.032) 0:00:12.692 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": 
"VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.032) 0:00:12.725 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_label": "label", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.036) 0:00:12.761 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.033) 0:00:12.794 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.033) 0:00:12.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.031) 0:00:12.858 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.031) 0:00:12.890 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.083) 0:00:12.974 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:46:34 +0000 (0:00:00.035) 0:00:13.010 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "label", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:46:36 +0000 (0:00:01.886) 0:00:14.896 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.034) 0:00:14.931 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.031) 0:00:14.963 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "label", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.041) 0:00:15.005 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.036) 0:00:15.041 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "label", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.038) 0:00:15.080 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:46:36 +0000 (0:00:00.033) 0:00:15.114 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:46:37 +0000 (0:00:00.994) 0:00:16.109 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=df87a08e-8181-4b0a-a60c-803ca76b3721', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:46:38 +0000 (0:00:00.594) 0:00:16.704 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, 
"name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:46:38 +0000 (0:00:00.706) 0:00:17.411 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:46:39 +0000 (0:00:00.432) 0:00:17.843 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:46:39 +0000 (0:00:00.030) 0:00:17.874 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:37 Wednesday 01 June 2022 17:46:40 +0000 (0:00:00.895) 0:00:18.770 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out 
pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:46:40 +0000 (0:00:00.053) 0:00:18.824 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:46:40 +0000 (0:00:00.030) 0:00:18.855 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "label", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:46:40 +0000 (0:00:00.039) 0:00:18.894 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "ext4", "label": "label", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:46:40 +0000 (0:00:00.526) 0:00:19.421 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002994", "end": "2022-06-01 13:46:40.708410", "rc": 0, "start": "2022-06-01 13:46:40.705416" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:46:41 +0000 (0:00:00.490) 0:00:19.911 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003094", "end": "2022-06-01 13:46:41.121343", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:46:41.118249" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:46:41 +0000 (0:00:00.411) 0:00:20.322 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:46:41 +0000 (0:00:00.030) 0:00:20.353 ******** ok: [/cache/rhel-x.qcow2] 
=> { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:46:41 +0000 (0:00:00.030) 0:00:20.383 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:46:41 +0000 (0:00:00.063) 0:00:20.446 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.038) 0:00:20.485 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.126) 0:00:20.611 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" 
}, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.038) 0:00:20.650 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.046) 0:00:20.696 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.038) 0:00:20.735 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:46:42 
+0000 (0:00:00.036) 0:00:20.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.042) 0:00:20.814 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.030) 0:00:20.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.030) 0:00:20.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.029) 0:00:20.903 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.032) 0:00:20.936 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.049) 0:00:20.985 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.041) 0:00:21.027 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.044) 0:00:21.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.035) 0:00:21.107 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:46:42 +0000 
(0:00:00.035) 0:00:21.143 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.043) 0:00:21.186 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:46:42 +0000 (0:00:00.041) 0:00:21.228 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105595.6151216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105595.6151216, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105595.6151216, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.398) 0:00:21.626 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.038) 0:00:21.665 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } 
MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.038) 0:00:21.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.035) 0:00:21.739 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:21.771 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.037) 0:00:21.808 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:21.840 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.034) 0:00:21.874 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as 
the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.032) 0:00:21.907 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.039) 0:00:21.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.032) 0:00:21.978 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:22.010 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:22.042 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.078) 0:00:22.120 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.033) 0:00:22.153 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.039) 0:00:22.193 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.036) 0:00:22.230 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:22.261 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.033) 0:00:22.294 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.035) 0:00:22.330 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.032) 0:00:22.362 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.030) 0:00:22.392 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.030) 0:00:22.423 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:46:43 +0000 (0:00:00.031) 0:00:22.454 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:22.487 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.034) 0:00:22.522 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.030) 0:00:22.553 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.030) 0:00:22.584 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:22.616 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:22.649 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.038) 0:00:22.687 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.046) 0:00:22.734 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.033) 0:00:22.768 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.033) 0:00:22.801 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:22.834 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:22.865 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:22.897 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.040) 0:00:22.937 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.036) 0:00:22.974 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:23.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:23.038 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:23.071 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.033) 0:00:23.104 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.033) 0:00:23.138 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:23.169 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:23.202 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:23.234 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.031) 0:00:23.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [relabel] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:39 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.032) 0:00:23.298 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.069) 0:00:23.368 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:46:44 +0000 (0:00:00.045) 0:00:23.413 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version 
specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.546) 0:00:23.960 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.074) 0:00:24.034 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.032) 0:00:24.067 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.031) 0:00:24.099 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.063) 0:00:24.162 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.028) 0:00:24.190 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.030) 0:00:24.221 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.031) 0:00:24.252 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_label": "relabel", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.037) 0:00:24.290 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.032) 0:00:24.323 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.031) 0:00:24.355 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.030) 0:00:24.385 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.029) 0:00:24.415 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:46:45 +0000 (0:00:00.043) 0:00:24.458 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:46:46 +0000 (0:00:00.032) 
0:00:24.491 ********
changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "configure format", "device": "/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 01 June 2022 17:46:47 +0000 (0:00:01.807) 0:00:26.298 ********
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path:
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:46:47 +0000 (0:00:00.035) 0:00:26.334 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:46:47 +0000 (0:00:00.037) 0:00:26.371 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "configure format", "device": "/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "dosfstools", "mdadm", "e2fsprogs", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test 
verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:46:47 +0000 (0:00:00.044) 0:00:26.416 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:46:47 +0000 (0:00:00.040) 0:00:26.456 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:46:48 +0000 (0:00:00.038) 0:00:26.495 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 
Wednesday 01 June 2022 17:46:48 +0000 (0:00:00.034) 0:00:26.529 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:46:48 +0000 (0:00:00.691) 0:00:27.221 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=df87a08e-8181-4b0a-a60c-803ca76b3721', u'state': u'mounted', u'dump': 0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:46:49 +0000 (0:00:00.409) 0:00:27.630 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:46:49 +0000 (0:00:00.672) 0:00:28.302 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": 
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:46:50 +0000 (0:00:00.398) 0:00:28.701 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:46:50 +0000 (0:00:00.030) 0:00:28.731 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:51 Wednesday 01 June 2022 17:46:51 +0000 (0:00:00.876) 0:00:29.607 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:46:51 +0000 (0:00:00.062) 0:00:29.669 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:46:51 +0000 (0:00:00.032) 0:00:29.702 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:46:51 +0000 (0:00:00.039) 0:00:29.741 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": 
"cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "ext4", "label": "relabel", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19
Wednesday 01 June 2022 17:46:51 +0000 (0:00:00.436) 0:00:30.178 ********
ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002983", "end": "2022-06-01 13:46:51.369781", "rc": 0, "start": "2022-06-01 13:46:51.366798" }

STDOUT:

UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0
UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0
UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 /opt/test1 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path:
/tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.391) 0:00:30.569 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003115", "end": "2022-06-01 13:46:51.775010", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:46:51.771895" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.406) 0:00:30.976 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.031) 0:00:31.008 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.095) 0:00:31.104 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
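[Editor's note: the WARNING above is Ansible's generic notice that a nested `include_tasks` loop reuses a loop-variable name (`storage_test_volume`) still held by an enclosing loop. A minimal sketch of the remedy the warning suggests; the task shape mirrors the log, but the replacement variable name is illustrative, not taken from this test suite:

```yaml
# Hypothetical fix for the collision reported above: give the inner
# loop its own variable name via loop_control instead of inheriting
# 'storage_test_volume' from the enclosing scope.
- name: Verify the volumes with no pool were correctly managed
  include_tasks: test-verify-volume.yml
  loop: "{{ _storage_volumes_list }}"
  loop_control:
    loop_var: storage_test_inner_volume  # illustrative name
```

References to the old variable inside the included file would need to be renamed to match, which is presumably why the test suite tolerates the warning instead.]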
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.091) 0:00:31.195 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.044) 0:00:31.240 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.129) 0:00:31.370 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.041) 0:00:31.412 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:46:52 +0000 (0:00:00.052) 0:00:31.464 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.042) 0:00:31.507 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.037) 0:00:31.545 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 17:46:53 +0000 (0:00:00.040) 0:00:31.585 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.033) 0:00:31.618 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.032) 0:00:31.650 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.034) 0:00:31.685 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.033) 0:00:31.718 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.048) 0:00:31.766 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.035) 0:00:31.802 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.038) 0:00:31.841 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.032) 0:00:31.874 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.036) 0:00:31.910 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:46:53 
+0000 (0:00:00.039) 0:00:31.949 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.039) 0:00:31.989 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105606.9531214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105606.9531214, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105606.9531214, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.396) 0:00:32.385 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.039) 0:00:32.424 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:46:53 +0000 (0:00:00.037) 0:00:32.461 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.046) 0:00:32.507 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.031) 0:00:32.539 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.037) 0:00:32.577 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.032) 0:00:32.610 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.033) 0:00:32.643 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.032) 0:00:32.676 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got 
info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.040) 0:00:32.716 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.033) 0:00:32.750 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.030) 0:00:32.781 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.031) 0:00:32.813 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.031) 0:00:32.844 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.029) 0:00:32.874 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.035) 0:00:32.909 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.037) 0:00:32.946 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.030) 0:00:32.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.034) 0:00:33.011 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.031) 0:00:33.043 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.049) 0:00:33.092 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.052) 0:00:33.144 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.042) 0:00:33.186 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.035) 0:00:33.222 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.032) 0:00:33.255 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.034) 0:00:33.289 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.033) 0:00:33.322 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.033) 0:00:33.356 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.035) 0:00:33.391 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.033) 0:00:33.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:46:54 +0000 (0:00:00.034) 0:00:33.459 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.038) 0:00:33.497 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.033) 0:00:33.531 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.033) 0:00:33.565 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.080) 0:00:33.645 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.033) 0:00:33.679 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.034) 0:00:33.713 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.034) 0:00:33.748 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.032) 0:00:33.780 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.031) 0:00:33.812 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.033) 0:00:33.845 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.031) 0:00:33.877 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.030) 0:00:33.907 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.032) 0:00:33.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.032) 0:00:33.972 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.031) 
0:00:34.004 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.034) 0:00:34.039 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.033) 0:00:34.072 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [run relabel again to verify idempotence] ********************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:53 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.031) 0:00:34.104 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.070) 0:00:34.174 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:46:55 +0000 (0:00:00.045) 0:00:34.220 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.552) 0:00:34.773 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.072) 0:00:34.846 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.034) 0:00:34.880 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.032) 0:00:34.913 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.062) 0:00:34.975 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.030) 0:00:35.006 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.034) 0:00:35.041 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.038) 0:00:35.079 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "fs_label": "relabel", "fs_type": "ext4", "mount_point": "/opt/test1", "name": "test1", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.040) 0:00:35.119 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 
17:46:56 +0000 (0:00:00.034) 0:00:35.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.033) 0:00:35.188 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.033) 0:00:35.221 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.039) 0:00:35.261 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.052) 0:00:35.314 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:46:56 +0000 (0:00:00.031) 0:00:35.346 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, 
"path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "xfsprogs", "mdadm", "e2fsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:46:58 +0000 (0:00:01.608) 0:00:36.954 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.033) 0:00:36.988 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.031) 0:00:37.019 ******** ok: [/cache/rhel-x.qcow2] => { 
"blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" } ], "packages": [ "xfsprogs", "mdadm", "e2fsprogs", "dosfstools" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.041) 0:00:37.060 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.036) 
0:00:37.096 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.040) 0:00:37.137 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:46:58 +0000 (0:00:00.035) 0:00:37.173 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:46:59 +0000 (0:00:00.695) 0:00:37.869 ******** ok: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=df87a08e-8181-4b0a-a60c-803ca76b3721', u'state': u'mounted', u'dump': 
0, u'path': u'/opt/test1', u'passno': 0, u'opts': u'defaults', u'fstype': u'ext4'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:46:59 +0000 (0:00:00.405) 0:00:38.274 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:47:00 +0000 (0:00:00.690) 0:00:38.965 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:47:00 +0000 (0:00:00.398) 0:00:39.363 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:47:00 +0000 (0:00:00.030) 0:00:39.394 ******** ok: [/cache/rhel-x.qcow2] TASK [check for idempotency] *************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:65 Wednesday 01 June 2022 17:47:01 +0000 (0:00:00.872) 0:00:40.266 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:70 Wednesday 01 June 2022 17:47:01 +0000 (0:00:00.037) 0:00:40.304 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:47:01 +0000 (0:00:00.061) 0:00:40.365 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:47:01 +0000 (0:00:00.031) 0:00:40.396 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_kernel_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "_raw_kernel_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "relabel", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:47:01 +0000 (0:00:00.040) 0:00:40.436 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": "e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "ext4", "label": "relabel", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:47:02 +0000 (0:00:00.431) 0:00:40.868 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002982", "end": "2022-06-01 13:47:02.082784", "rc": 0, "start": "2022-06-01 13:47:02.079802" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 /opt/test1 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:47:02 +0000 (0:00:00.418) 0:00:41.287 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002961", "end": "2022-06-01 
13:47:02.497695", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:47:02.494734" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.411) 0:00:41.698 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.029) 0:00:41.727 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.033) 0:00:41.760 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.067) 0:00:41.828 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.106) 0:00:41.934 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.124) 0:00:42.058 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.041) 0:00:42.099 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2419519, "block_size": 4096, "block_total": 2554693, "block_used": 135174, "device": "/dev/vdb", "fstype": "ext4", "inode_available": 655349, "inode_total": 655360, "inode_used": 11, "mount": "/opt/test1", "options": "rw,seclabel,relatime", "size_available": 9910349824, "size_total": 10464022528, "uuid": "df87a08e-8181-4b0a-a60c-803ca76b3721" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.061) 0:00:42.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.039) 0:00:42.200 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.037) 0:00:42.238 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 
01 June 2022 17:47:03 +0000 (0:00:00.039) 0:00:42.277 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.034) 0:00:42.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.030) 0:00:42.342 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.030) 0:00:42.373 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.031) 0:00:42.405 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, 
"changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:47:03 +0000 (0:00:00.050) 0:00:42.455 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.039) 0:00:42.494 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.039) 0:00:42.533 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.032) 0:00:42.565 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.032) 0:00:42.597 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:47:04 
+0000 (0:00:00.039) 0:00:42.637 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.040) 0:00:42.678 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105606.9531214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105606.9531214, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105606.9531214, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.398) 0:00:43.077 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.039) 0:00:43.116 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.036) 0:00:43.153 ******** ok: [/cache/rhel-x.qcow2] => { 
"ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.032) 0:00:43.185 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.029) 0:00:43.214 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.036) 0:00:43.251 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.033) 0:00:43.284 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.034) 0:00:43.319 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.031) 0:00:43.351 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got 
info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.037) 0:00:43.388 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.030) 0:00:43.419 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:47:04 +0000 (0:00:00.032) 0:00:43.451 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.031) 0:00:43.482 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.034) 0:00:43.516 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.031) 0:00:43.548 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.040) 0:00:43.589 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.038) 0:00:43.627 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.035) 0:00:43.662 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.035) 0:00:43.698 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.036) 0:00:43.734 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.036) 0:00:43.770 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.035) 0:00:43.806 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.036) 0:00:43.842 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.032) 0:00:43.875 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.030) 0:00:43.906 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.034) 0:00:43.940 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.035) 0:00:43.976 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.032) 0:00:44.009 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.032) 0:00:44.041 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.033) 0:00:44.074 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.032) 0:00:44.107 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.037) 0:00:44.145 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.032) 0:00:44.177 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.030) 0:00:44.208 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.033) 0:00:44.241 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.033) 0:00:44.275 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.034) 0:00:44.309 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.038) 0:00:44.348 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.094) 0:00:44.442 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 
Wednesday 01 June 2022 17:47:05 +0000 (0:00:00.033) 0:00:44.476 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.032) 0:00:44.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.031) 0:00:44.540 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.030) 0:00:44.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.032) 0:00:44.602 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.034) 0:00:44.637 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.031) 
0:00:44.669 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.035) 0:00:44.704 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.030) 0:00:44.734 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:72 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.032) 0:00:44.767 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.083) 0:00:44.851 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.047) 0:00:44.898 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:47:06 +0000 (0:00:00.577) 0:00:45.475 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.077) 0:00:45.552 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.041) 0:00:45.594 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.043) 0:00:45.638 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a 
list of rpm packages installed on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.070) 0:00:45.708 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.027) 0:00:45.736 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.033) 0:00:45.769 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.036) 0:00:45.806 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": [ { "disks": [ "vdb" ], "mount_point": "/opt/test1", "name": "test1", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.041) 0:00:45.847 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.034) 
0:00:45.882 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.033) 0:00:45.915 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.037) 0:00:45.953 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.032) 0:00:45.986 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.050) 0:00:46.037 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:47:07 +0000 (0:00:00.034) 0:00:46.071 ******** changed: [/cache/rhel-x.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": 
"ext4", "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:47:09 +0000 (0:00:01.811) 0:00:47.883 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:47:09 +0000 (0:00:00.035) 0:00:47.918 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:47:09 +0000 (0:00:00.028) 0:00:47.947 ******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/vdb", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/md/test1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "absent" } ], "packages": [ "dosfstools", "xfsprogs", "mdadm" ], "pools": [], "volumes": [ { "_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:47:09 +0000 (0:00:00.037) 0:00:47.985 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:47:09 +0000 (0:00:00.039) 0:00:48.025 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_mount_id": 
"UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:47:09 +0000 (0:00:00.088) 0:00:48.114 ******** changed: [/cache/rhel-x.qcow2] => (item={u'src': u'UUID=df87a08e-8181-4b0a-a60c-803ca76b3721', u'state': u'absent', u'fstype': u'ext4', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test1", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:47:10 +0000 (0:00:00.415) 0:00:48.530 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] 
****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:47:10 +0000 (0:00:00.697) 0:00:49.228 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:47:10 +0000 (0:00:00.033) 0:00:49.261 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:47:11 +0000 (0:00:00.681) 0:00:49.943 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:47:11 +0000 (0:00:00.411) 0:00:50.354 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:47:11 +0000 (0:00:00.029) 0:00:50.383 ******** ok: [/cache/rhel-x.qcow2] TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:83 Wednesday 01 June 2022 17:47:12 +0000 (0:00:00.862) 0:00:51.246 ******** included: /tmp/tmp7247_7fr/tests/verify-role-results.yml for /cache/rhel-x.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:1 Wednesday 01 June 2022 17:47:12 +0000 (0:00:00.063) 0:00:51.310 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:6 Wednesday 01 June 2022 17:47:12 +0000 (0:00:00.035) 0:00:51.345 ******** ok: [/cache/rhel-x.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/vdb", "_mount_id": "UUID=df87a08e-8181-4b0a-a60c-803ca76b3721", "_raw_device": "/dev/vdb", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "vdb" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:14 Wednesday 01 June 2022 17:47:12 +0000 (0:00:00.055) 0:00:51.401 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "info": { "/dev/md/test1": { "fstype": "xfs", "label": "", "name": "/dev/md/test1", "size": "10G", "type": "raid1", "uuid": "3517ecef-b308-482c-8907-7eb83fc04137" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "linux_raid_member", "label": "test1", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "cea8517f-119e-6117-5a2d-ac1a99857af3" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "368K", "type": "rom", "uuid": "2022-06-01-16-20-09-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "vfat", "label": "", "name": "/dev/vda2", "size": "200M", "type": "partition", "uuid": "7B77-95E7" }, "/dev/vda3": { "fstype": "xfs", "label": "boot", "name": "/dev/vda3", "size": "500M", "type": "partition", "uuid": "75adb680-1c10-4e41-8d90-7dd03373b3f7" }, "/dev/vda4": { "fstype": "xfs", "label": "root", "name": "/dev/vda4", "size": "9.3G", "type": "partition", "uuid": 
"e80c0e7b-21c5-4cd9-8217-a5180dc70345" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:19 Wednesday 01 June 2022 17:47:13 +0000 (0:00:00.417) 0:00:51.819 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003467", "end": "2022-06-01 13:47:13.010168", "rc": 0, "start": "2022-06-01 13:47:13.006701" } STDOUT: UUID=75adb680-1c10-4e41-8d90-7dd03373b3f7 /boot xfs defaults 0 0 UUID=e80c0e7b-21c5-4cd9-8217-a5180dc70345 / xfs defaults 0 0 UUID=7B77-95E7 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:24 Wednesday 01 June 2022 17:47:13 +0000 (0:00:00.395) 0:00:52.214 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003555", "end": "2022-06-01 13:47:13.425242", "failed_when_result": false, "rc": 0, "start": "2022-06-01 13:47:13.421687" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:33 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.414) 0:00:52.629 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:40 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.029) 0:00:52.659 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_pool": null }, "changed": false } TASK [Verify the volumes with no pool were 
correctly managed] ****************** task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:47 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.030) 0:00:52.689 ******** [WARNING]: The loop variable 'storage_test_volume' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. included: /tmp/tmp7247_7fr/tests/test-verify-volume.yml for /cache/rhel-x.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:2 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.115) 0:00:52.804 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:10 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.038) 0:00:52.843 ******** included: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml for /cache/rhel-x.qcow2 included: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml for /cache/rhel-x.qcow2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:6 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.116) 
0:00:52.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/vdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:14 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.034) 0:00:52.994 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:28 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.039) 0:00:53.033 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:37 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.030) 0:00:53.063 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:45 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.036) 0:00:53.099 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:54 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.034) 0:00:53.134 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:58 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.034) 0:00:53.168 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:63 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.031) 0:00:53.200 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-mount.yml:75 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.032) 0:00:53.233 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:2 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.031) 0:00:53.265 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:25 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.047) 0:00:53.312 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify 
the fstab mount point] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:32 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.029) 0:00:53.341 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:39 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.037) 0:00:53.379 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fstab.yml:49 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.029) 0:00:53.409 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:4 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.030) 0:00:53.439 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-fs.yml:10 Wednesday 01 June 2022 17:47:14 +0000 (0:00:00.031) 0:00:53.471 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:4 Wednesday 01 
June 2022 17:47:15 +0000 (0:00:00.026) 0:00:53.497 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654105628.6081214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1654105628.6081214, "dev": 5, "device_type": 64528, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1654105628.6081214, "nlink": 1, "path": "/dev/vdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:10 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.393) 0:00:53.890 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:18 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.040) 0:00:53.931 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:24 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.027) 0:00:53.959 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:28 Wednesday 01 June 2022 17:47:15 +0000 
(0:00:00.035) 0:00:53.994 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-device.yml:33 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.032) 0:00:54.026 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:3 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.027) 0:00:54.054 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:10 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.032) 0:00:54.086 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:16 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.035) 0:00:54.122 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:25 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.031) 0:00:54.154 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:33 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.035) 0:00:54.189 ******** 
skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:39 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.034) 0:00:54.223 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:44 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.033) 0:00:54.257 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:50 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.034) 0:00:54.291 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:56 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.037) 0:00:54.328 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:62 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.034) 0:00:54.363 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:69 
Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.045) 0:00:54.408 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:74 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.037) 0:00:54.446 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:80 Wednesday 01 June 2022 17:47:15 +0000 (0:00:00.032) 0:00:54.478 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:86 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.030) 0:00:54.509 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-encryption.yml:92 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.031) 0:00:54.540 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:7 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.030) 0:00:54.570 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:13 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.033) 0:00:54.604 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:17 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.034) 0:00:54.639 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:21 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.033) 0:00:54.672 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:25 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.032) 0:00:54.705 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:31 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.032) 0:00:54.737 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-md.yml:37 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.029) 0:00:54.767 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:3 Wednesday 01 June 2022 
17:47:16 +0000 (0:00:00.029) 0:00:54.797 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:9 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.029) 0:00:54.827 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:15 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.030) 0:00:54.857 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:20 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.031) 0:00:54.888 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:25 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.038) 0:00:54.927 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:28 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.034) 0:00:54.962 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:31 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.035) 0:00:54.997 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] 
******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:36 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.034) 0:00:55.032 ******** skipping: [/cache/rhel-x.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:39 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.034) 0:00:55.066 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:44 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.033) 0:00:55.100 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:47 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.101) 0:00:55.201 ******** ok: [/cache/rhel-x.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-size.yml:50 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.039) 0:00:55.240 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:6 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.032) 0:00:55.273 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:14 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.031) 0:00:55.305 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:17 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.028) 0:00:55.334 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:22 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.030) 0:00:55.365 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:26 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.031) 0:00:55.396 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:32 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.029) 0:00:55.425 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume-cache.yml:36 Wednesday 01 June 2022 17:47:16 +0000 (0:00:00.032) 0:00:55.458 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/tmp7247_7fr/tests/test-verify-volume.yml:16 Wednesday 01 June 2022 17:47:17 +0000 (0:00:00.032) 0:00:55.490 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmp7247_7fr/tests/verify-role-results.yml:57 Wednesday 01 June 2022 17:47:17 +0000 (0:00:00.032) 0:00:55.522 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null, "storage_test_volume": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=260 changed=5 unreachable=0 failed=0 skipped=229 rescued=0 ignored=0 Wednesday 01 June 2022 17:47:17 +0000 (0:00:00.017) 0:00:55.540 ******** =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.89s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get service facts -------------------------- 1.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.81s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.61s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 set up internal repositories 
-------------------------------------------- 1.36s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- linux-system-roles.storage : make sure blivet is available -------------- 1.17s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Gathering Facts --------------------------------------------------------- 1.11s /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:2 ----------------------------- linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.99s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : make sure required packages are installed --- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 linux-system-roles.storage : Update facts ------------------------------- 0.90s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.88s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.87s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 0.86s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : get required packages ---------------------- 0.72s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.71s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 
0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.70s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.69s /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:47:17 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:47:19 +0000 (0:00:01.341) 0:00:01.364 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.34s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_volume_relabel_nvme_generated.yml ****************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_volume_relabel_nvme_generated.yml PLAY [all] ********************************************************************* META: ran handlers META: ran handlers META: ran handlers PLAY [all] ********************************************************************* META: ran handlers 
META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:47:19 +0000 (0:00:00.017) 0:00:01.382 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.34s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- ansible-playbook 2.9.27 config file = /etc/ansible/ansible.cfg configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python2.7/site-packages/ansible executable location = /usr/bin/ansible-playbook python version = 2.7.5 (default, May 27 2022, 11:27:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] Using /etc/ansible/ansible.cfg as config file Skipping callback 'actionable', as we already have a stdout callback. Skipping callback 'counter_enabled', as we already have a stdout callback. Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'dense', as we already have a stdout callback. Skipping callback 'full_skip', as we already have a stdout callback. Skipping callback 'json', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'null', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. Skipping callback 'selective', as we already have a stdout callback. Skipping callback 'skippy', as we already have a stdout callback. Skipping callback 'stderr', as we already have a stdout callback. Skipping callback 'unixy', as we already have a stdout callback. Skipping callback 'yaml', as we already have a stdout callback. 
PLAYBOOK: rhel-x_setup.yml ***************************************************** 1 plays in /cache/rhel-x_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-x_setup.yml:6 Wednesday 01 June 2022 17:47:19 +0000 (0:00:00.022) 0:00:00.022 ******** ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [/cache/rhel-x.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-x.qcow2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Wednesday 01 June 2022 17:47:21 +0000 (0:00:01.317) 0:00:01.340 ******** =============================================================================== set up internal repositories -------------------------------------------- 1.32s /cache/rhel-x_setup.yml:6 ----------------------------------------------------- PLAYBOOK: tests_volume_relabel_scsi_generated.yml ****************************** 2 plays in /tmp/tmp7247_7fr/tests/tests_volume_relabel_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel_scsi_generated.yml:3 
Wednesday 01 June 2022 17:47:21 +0000 (0:00:00.016) 0:00:01.356 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel_scsi_generated.yml:7 Wednesday 01 June 2022 17:47:22 +0000 (0:00:01.155) 0:00:02.512 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:2 Wednesday 01 June 2022 17:47:22 +0000 (0:00:00.027) 0:00:02.540 ******** ok: [/cache/rhel-x.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:10 Wednesday 01 June 2022 17:47:23 +0000 (0:00:00.878) 0:00:03.419 ******** TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Wednesday 01 June 2022 17:47:23 +0000 (0:00:00.042) 0:00:03.462 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Wednesday 01 June 2022 17:47:23 +0000 (0:00:00.166) 0:00:03.628 ******** ok: [/cache/rhel-x.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.531) 0:00:04.160 ******** skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-x.qcow2] => (item=RedHat_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs" ] }, "ansible_included_var_files": [ "/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml" } skipping: [/cache/rhel-x.qcow2] => (item=RedHat_9.1.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_9.1.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.078) 0:00:04.239 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.026) 0:00:04.266 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.025) 0:00:04.291 ******** included: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed 
on host machine] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.196) 0:00:04.488 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Wednesday 01 June 2022 17:47:24 +0000 (0:00:00.019) 0:00:04.508 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Wednesday 01 June 2022 17:47:25 +0000 (0:00:01.101) 0:00:05.609 ******** ok: [/cache/rhel-x.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Wednesday 01 June 2022 17:47:25 +0000 (0:00:00.048) 0:00:05.658 ******** ok: [/cache/rhel-x.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Wednesday 01 June 2022 17:47:25 +0000 (0:00:00.047) 0:00:05.705 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Wednesday 01 June 2022 17:47:26 +0000 (0:00:00.694) 0:00:06.400 ******** included: 
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-x.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Wednesday 01 June 2022 17:47:26 +0000 (0:00:00.082) 0:00:06.482 ******** TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Wednesday 01 June 2022 17:47:26 +0000 (0:00:00.023) 0:00:06.506 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Wednesday 01 June 2022 17:47:26 +0000 (0:00:00.023) 0:00:06.529 ******** TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Wednesday 01 June 2022 17:47:26 +0000 (0:00:00.024) 0:00:06.554 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Wednesday 01 June 2022 17:47:27 +0000 (0:00:00.846) 0:00:07.400 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cockpit-motd.service": { "name": "cockpit-motd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-http.service": { "name": "cockpit-wsinstance-http.service", "source": "systemd", "state": "inactive", "status": "static" }, "cockpit-wsinstance-https-factory@.service": { "name": "cockpit-wsinstance-https-factory@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit-wsinstance-https@.service": { "name": "cockpit-wsinstance-https@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cockpit.service": { "name": "cockpit.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "insights-client-boot.service": { "name": "insights-client-boot.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "insights-client-results.service": { "name": "insights-client-results.service", "source": "systemd", "state": "inactive", "status": "static" }, "insights-client.service": { "name": "insights-client.service", "source": "systemd", "state": "inactive", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "packagekit-offline-update.service": { "name": "packagekit-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "packagekit.service": { "name": "packagekit.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { 
"name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-fsck@dev-vda2.service": { "name": "systemd-fsck@dev-vda2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Wednesday 01 June 2022 17:47:29 +0000 (0:00:01.839) 0:00:09.240 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.047) 0:00:09.287 ******** TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.031) 0:00:09.319 ******** ok: [/cache/rhel-x.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.546) 0:00:09.866 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.031) 0:00:09.898 ******** TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.029) 0:00:09.927 
******** ok: [/cache/rhel-x.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.036) 0:00:09.963 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.037) 0:00:10.001 ******** ok: [/cache/rhel-x.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.035) 0:00:10.037 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Wednesday 01 June 2022 17:47:29 +0000 (0:00:00.031) 0:00:10.068 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Wednesday 01 June 2022 17:47:30 +0000 (0:00:00.032) 0:00:10.100 ******** TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Wednesday 01 June 2022 17:47:30 +0000 (0:00:00.029) 0:00:10.130 
******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Wednesday 01 June 2022 17:47:30 +0000 (0:00:00.031) 0:00:10.161 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "stat": { "atime": 1654103529.7011216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1654103527.5841215, "dev": 64516, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 25792400, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1654103527.5821216, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4245553602", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Wednesday 01 June 2022 17:47:30 +0000 (0:00:00.528) 0:00:10.689 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Wednesday 01 June 2022 17:47:30 +0000 (0:00:00.028) 0:00:10.718 ******** ok: [/cache/rhel-x.qcow2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:13 Wednesday 01 June 2022 17:47:31 +0000 (0:00:00.852) 0:00:11.570 ******** ok: 
[/cache/rhel-x.qcow2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:20 Wednesday 01 June 2022 17:47:31 +0000 (0:00:00.033) 0:00:11.604 ******** included: /tmp/tmp7247_7fr/tests/get_unused_disk.yml for /cache/rhel-x.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 Wednesday 01 June 2022 17:47:31 +0000 (0:00:00.046) 0:00:11.651 ******** ok: [/cache/rhel-x.qcow2] => { "changed": false, "disks": "Unable to find unused disk" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:9 Wednesday 01 June 2022 17:47:32 +0000 (0:00:00.552) 0:00:12.203 ******** skipping: [/cache/rhel-x.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmp7247_7fr/tests/get_unused_disk.yml:14 Wednesday 01 June 2022 17:47:32 +0000 (0:00:00.030) 0:00:12.233 ******** fatal: [/cache/rhel-x.qcow2]: FAILED! => { "changed": false } MSG: Unable to find enough unused disks. Exiting playbook. 
PLAY RECAP *********************************************************************
/cache/rhel-x.qcow2 : ok=27 changed=0 unreachable=0 failed=1 skipped=13 rescued=0 ignored=0

Wednesday 01 June 2022 17:47:32 +0000 (0:00:00.020) 0:00:12.254 ********
===============================================================================
linux-system-roles.storage : get service facts -------------------------- 1.84s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
set up internal repositories -------------------------------------------- 1.32s
/cache/rhel-x_setup.yml:6 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/tmp7247_7fr/tests/tests_volume_relabel_scsi_generated.yml:3 --------------
linux-system-roles.storage : make sure blivet is available -------------- 1.10s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Gathering Facts --------------------------------------------------------- 0.88s
/tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:2 -----------------------------
linux-system-roles.storage : Update facts ------------------------------- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
linux-system-roles.storage : make sure required packages are installed --- 0.85s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
linux-system-roles.storage : get required packages ---------------------- 0.69s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Find unused disks in the system ----------------------------------------- 0.55s
/tmp/tmp7247_7fr/tests/get_unused_disk.yml:2 ----------------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 0.55s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
linux-system-roles.storage : Ensure ansible_facts used by role ---------- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 --
linux-system-roles.storage : retrieve facts for the /etc/crypttab file --- 0.53s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
linux-system-roles.storage : include the appropriate provider tasks ----- 0.20s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:13 -----
linux-system-roles.storage : set platform/version specific variables ---- 0.17s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main.yml:2 ------
linux-system-roles.storage : enable copr repositories if needed --------- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
linux-system-roles.storage : Set platform/version specific variables ---- 0.08s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 --
linux-system-roles.storage : show storage_pools ------------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
linux-system-roles.storage : show storage_volumes ----------------------- 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
linux-system-roles.storage : Set storage_cryptsetup_services ------------ 0.05s
/tmp/tmp7247_7fr/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
include_tasks ----------------------------------------------------------- 0.05s
/tmp/tmp7247_7fr/tests/tests_volume_relabel.yml:20 ----------------------------

Testing /tmp/tmp7247_7fr/tests/tests_change_disk_fs.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_disk_fs_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_disk_fs_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_disk_mount.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_disk_mount_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_disk_mount_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_fs_use_partitions_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_mount.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_mount_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_change_mount_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_disk_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lv_size_equal_to_vg_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_cache_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvm_pool_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_lvmvdo_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_partition_volume_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_pool_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_create_raid_volume_then_remove_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_default.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_default_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_default_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_deps.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_deps_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_deps_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_disk_errors.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_disk_errors_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_disk_errors_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_existing_lvm_pool_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_cache_volume_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_pool_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_fatals_raid_volume_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_filesystem_one_disk_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_include_vars_from_parent_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks_pool_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_luks_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_auto_size_cap_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_errors.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_errors_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_errors_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_multiple_disks_multiple_volumes_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_multiple_volumes_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_one_disk_one_volume_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_percent_size.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_lvm_percent_size_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_misc.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_misc_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_misc_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_missing_volume_type_in_pool_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_null_raid_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_null_raid_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_null_raid_pool_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_pool_options.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_pool_options_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_pool_options_scsi_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_volume_options.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_volume_options_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_raid_volume_options_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_remove_mount.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_remove_mount_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_remove_mount_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_remove_nonexistent_pool_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_resize.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_resize_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_resize_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_safe_mode_check.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_safe_mode_check_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_safe_mode_check_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_swap.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_swap_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_swap_scsi_generated.yml...FAILURE
Testing /tmp/tmp7247_7fr/tests/tests_volume_relabel.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_volume_relabel_nvme_generated.yml...SUCCESS
Testing /tmp/tmp7247_7fr/tests/tests_volume_relabel_scsi_generated.yml...FAILURE
FAILURE